goodbye binary?
One other semi-related comment: I read an article back in '04 about programmable processors, a technology nicknamed spintronics.
The basic idea behind it is that rather than using a charge to flip an electron from on to off, which creates heat with every flip, a magnetic field is used to control the electron's spin, and that spin state is then measured. In 2004 they were able to accurately measure the spin to a precision of 16 distinct positions. Using the magnetic field to flip it reduces heat, and measuring a range of positions rather than just checking on or off obviously points toward more powerful processors that consume less power and space.
Another benefit was that processors could then be programmed on the fly. The article I originally read stated that this would be extremely attractive to cell phone manufacturers, and went on to say that a specific cell phone company was going to try to test it in a phone in Europe. Based on what I read today, that doesn't seem to have happened, at least not in 2004. But the article I listed at the bottom (the fourth link) does state this is more easily achievable in "gallium arsenide, which is used in cell phones."
Anyway, the whole point behind this ramble is that the first thing that came to my mind was: if a chip could measure 16 positions back then, and presumably many more down the road, what does that do to binary?
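A quick aside of my own, not from any of the articles below: in information terms, an element that can be read in one of 16 distinct positions holds log2(16) = 4 bits, so one such cell could do the work of four binary ones. A rough Python sketch of that arithmetic:

    import math

    # An element readable in N distinguishable states carries log2(N) bits.
    for states in (2, 4, 16, 256):
        print(f"{states:>3} states per element = {math.log2(states):.0f} bits per element")

    # So a byte (8 bits) fits into two 16-state cells instead of eight binary cells.
    value = 0b10110101                                    # arbitrary 8-bit value (0xB5)
    cells = [(value >> shift) & 0xF for shift in (4, 0)]  # split into two 4-bit "digits"
    print(cells)                                          # [11, 5] -> 0xB, 0x5
    restored = (cells[0] << 4) | cells[1]
    print(restored == value)                              # True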
Originally that made me think: OK, if processors can measure 16 positions, can we really make binary obsolete, or at least stop using it as the underlying technology? Data storage seemed like the obstacle to me.
But reading now that spintronics is currently being explored for new data storage methods leads me to believe that processors and storage devices could both operate on the same model.
In other words, am I presuming too much to think that we're not too far off from ditching binary for the most common things, and that processing and storage will soon no longer be limited by 1s and 0s? Sure, there will always be true/false, or as TheDailyWTF sometimes showcases, true/false/maybe or true/false/reallytrue/reallyfalse, but can we eventually get away from that?
That's a really long ramble to say that hopefully, in the future, this problem of my 0.09 != the computer's 0.09 in floating point goes away.
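For what it's worth, here's a small illustration of my own (not from the articles) of why that mismatch happens today: 0.09 has no exact binary floating-point representation, so two values that are both "0.09" on paper can differ once a little arithmetic is involved. The usual workaround for now is comparing within a tolerance instead of with ==.

    import math

    a = 0.09
    b = 0.3 * 0.3                # mathematically 0.09, but not the same bits
    print(a == b)                # False
    print(f"{a:.20f}")           # prints something like 0.08999999999999999667
    print(f"{b:.20f}")           # prints something like 0.09000000000000000666

    # Compare within a tolerance instead of relying on ==.
    print(math.isclose(a, b, rel_tol=1e-9))   # True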
Some links and more info:
- Wikipedia has an article on it stating it's currently being explored in the design of new data storage methods.
- IBM has a chip history for 2004 with references about it here: http://www.ibm.com/developerworks/library/pa-yearend.html
- I believe the article I first read was in Scientific American, and I know I saved it; I remember the cover said something about 10 Einstein theories and where we are today, but I can't find it anywhere. This may be the article (not sure, I'm too cheap to buy it) because the pics on the left side look familiar... pic #4 shows the concepts of the different layers:
http://www.sciam.com/article.cfm?id=0007A735-759A-1CDD-B4A8809EC588EEDF
- Article on the status in '07, showing a silicon chip capable of crudely measuring it:
http://www.sciam.com/article.cfm?id=spintronics-breaks-the-silicon-barrier