Bad:
1. Testing every compile would take forever. (Hopefully we can eliminate the need to test every compile with an algorithm we can trust.)
2. Since the control flow is the same regardless of the binary differences, security will not be enhanced greatly. (Perhaps it would slow down key-gens, etc.)
Good:
1. Creation of key-gens, etc. should be slowed.
2. Could develop into a new way to distribute a BIOS (yet again, hopefully with an algorithm that can be trusted to do so; otherwise it would be chaos).
3. A way for Microsoft to release new software that doesn't do anything different, for more $.
I can see the ads now.. lol
I have thought of many ways to approach this situation.
1. A program that uses the algorithm to rearrange the binary of another program/file, etc. This poses the problem of creating an omni-program that would have to handle anything you feed it.
2. A control or library implementing the algorithm that programs could use to produce different binary compiles. We still need time to research how VB.NET (our testing compiler) actually takes code and does what, with what, at what time (rofl, did that make sense?). It's very hard to try to fool the compiler; I don't know if it can even be accomplished.
3. Have an in-memory program that reads what the compiler is doing as it compiles the program, copies the output, rearranges the data, and writes its own file, using the algorithm to randomize it. Crazy idea involving a lot of bug-prone areas (yes, that's just what this project needs: more random bugs, lol).
4. Tried thinking of a way to have the program change its own code at runtime, or change the binary of an external file it depends on. Not a bad idea, but it causes a lot of confusion, and I don't think it's possible to change binary data in RAM. I believe loaded code is write-protected and cannot be changed "directly", kind of like how you can change a text file on an HD but can't do the same to code in RAM. Or can it?