So how long do you think it will be before such technology is widely available to the public?
10 years? Maybe 25?
yes!!! A thread that I am waiting for!
everything you know is wrong
why do you want more than 64 bits? is addressing 16 exabytes of RAM at once insufficient for your needs?
MOAR IS BETTER!!!
Those are not numbers that just increase. They actually mean something, and if you don't know what, asking that question is meaningless.
That OP should start talking in qubits.
Seriously, if NOTHING else will run Microsoft Train Simulator at a frame rate of 30 or more..............
With the advent of the mass-produced quantum computer Microsoft and Apple will die due to technological differences - namely, the fact that these machines are infinitely superior to any other and that they operate on XNOR, which would necessitate a completely new approach to computer sciences.
The whole quantum computer thing is kind of overblown. As soon as they're going (and they're probably already going in the basement of the NSA), the support devices (cryogenics, whatever) will be very large and they'll be limited to very specific research applications. There's going to be a technology divide: the masses will still be using slightly faster variants of the machines we use right now while CERN will be using quantum computers. Even in the distant future, people will still be doing Internet/word processing on non-quantum computers because why would you need such a thing in the house? Think about it: there are machines in the world that can freeze helium solid, but all I need is a refrigerator to keep my beer cold. Anyway, I think your investments in major technology firms are pretty safe for many decades to come.
Unless we drop the client/server model in computing and switch back to mainframe/thin client
The whole electric typewriter thing is kind of overblown. As soon as they're going (and they're probably already going in the basement of the War Department), the support devices (vacuum tubes, whatever) will be very large and they'll be limited to very specific research applications. There's going to be a technology divide: the masses will still be using slightly better variants of the machines we use right now while MIT will be using punch cards. Even in the distant future, people will still be doing Mail/word processing on non-electric typewriters because why would you need such a thing in the house? Think about it: there are machines in the world that can freeze water solid, but all I need is an ice box to keep my cider cold. Anyway, I think your investments in the Dow Jones Industrial Average are pretty safe for many decades to come.
Yeah, I know I may eat my own words, but whatever. We still don't have flying cars or use furniture that you clean with a hose.
Also, you actually edited that entire post instead of just using search/replace. GG!
> Also, you actually edited that entire post instead of just using search/replace. GG!
ORIGINAL CONTENT, DO NOT STEAL!
Optical computing is likely to be the next step, when they reach the physical limit to the size of transistors.
If we can actually make a quantum computer, it won't be for a while - commercially even longer.
As far as 64-bit OSes are concerned, the bit size, among other more important details, indicates how much RAM a computer can have/use without an address extension.
32-bit computers can address a maximum of 2^32 bytes, or 4 GB, of RAM. We have gotten to the point where 4 GB is easy to use and get - this is where the 64-bit OS comes into play.
A 64-bit OS can use up to 2^64 bytes of RAM, or 16 exabytes (roughly 17.2 billion GB). Except for the usefulness of calculating with very large numbers efficiently, we won't need more than 64 bits until we can use over 16 exabytes of RAM.
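The arithmetic above is easy to check; a quick sanity check in Python, assuming binary prefixes (GiB = 2^30 bytes, EiB = 2^60 bytes):

```python
# Maximum byte-addressable memory for a given address width,
# assuming one address per byte (the usual convention).
def max_ram_bytes(bits):
    return 2 ** bits

GIB = 2 ** 30
EIB = 2 ** 60

print(max_ram_bytes(32) // GIB)  # 4 GiB for a 32-bit address space
print(max_ram_bytes(64) // EIB)  # 16 EiB for a 64-bit address space
print(max_ram_bytes(64) // GIB)  # 17179869184 GiB, i.e. ~17.2 billion GB
```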
Heh, kind of reminds me of how Bill Gates said something like 32 megabytes of hard drive space is more than we'll ever need.
>>19
True. The rest is useless junk for tree-dee vidyagaems and jewtoob and colour graphics. We don't need it. But it makes life easier for the non-blind of us.
I'm not much of an electrician or mathematician, but couldn't an OS be built on something between the complexity and economy of 32-bit and 64-bit processing? 48-bit?
Inevitably we will see great reductions in efficiency and move past 3D as we passed 2D. But do we really need new processors (vs. more of them) to unlock dependable holos & VR?
48 isn't a power of two, can't have that! :P
>>28 Fine then, but couldn't you just weave a microchip to avoid silly hardware mathematic mindgames? ª¿ª
But what if I want to run a simulation that's so realistic the characters in it think that they're real?
That's why we need 1024 bit processors and 256 bit OS!
Go ahead and talk a hardware manufacturer into mass-producing a 1024-bit processor. Then get a software company to make a 256-bit OS to run on it. I'd like to see one of these bad boys in action.
Considering that the latest advancement in the field of optical computing consists of one very large chip with about the same processing power as a handheld calculator, and that to be used it has to be cooled to 0.01 degrees above absolute zero, I think the advent of mass-produced quantum computers is not something we need to be concerned with just yet.
i cant count to 2^256, let alone 2^1024 -_-
Do we really need such accurate calculations for that? I would rather go for a number of cores counted in dozens ;) We don't need accurate numbers but speed, and having more bits won't do it. And even then you can have separate processors with their own memory just communicating with each other, as different parts of the brain do.
And BTW, characters can think that they're real even if they're 2D, or even if they have only some very simplistic senses and no virtual body at all ;) And this would be much more interesting than a simulation of the real world :)
>>28
I know a thing or two about programming and you're talking BS. Turing-completeness is not sufficient to simulate thought. You can have precision arithmetic done with an 8-bit CPU too. Bignum libraries don't care about the machine's word width. It has more to do with how much memory you have available than with the size of the processor's registers.
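The bignum point is easy to demonstrate: Python's built-in integers are arbitrary-precision, so even on a 64-bit (or, in principle, 8-bit) machine the interpreter happily computes with numbers far wider than a register. A minimal sketch:

```python
# Python ints are arbitrary-precision: the runtime chains machine words
# together behind the scenes, so the CPU's register width only affects
# speed, not the range of representable integers.
n = 2 ** 1024            # a 1025-bit number
print(n.bit_length())    # 1025
print(n % 1000)          # last three decimal digits, computed exactly: 216
```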
>>24
And then we could all be gods. Actually I'm quite fond of that idea, being able to control a simulated world full of little virtual creatures that evolve and adapt to better survive the virtual nature using their A.I. and then fucking with them for a couple hundred years while they're still wearing virtual rags, then leaving them to figure things out for themselves while I sit back and watch the fireworks, and maybe after a couple thousand simulated years I'd reintroduce my presence in their man-made reality and delete them all. Rinse, repeat.
So you say you know a thing or two about programming? You're not the only one :P Give me another proper reason to have a 1024-bit processor. Unless of course you want matrix multiplications done by it.
Of course you can have calculations as precise as you want even on an 8-bit computer, but if you want them to be as fast as possible you'd go for processors doing them in one instruction, i.e. with a 64-bit word you can have more precise calculations done in the same time as with a 32-bit one. The other thing is memory, but even then you can have huge memory available on an 8-bit system - it will be tricky, of course.
"Turing-completeness is not sufficient to simulate thought" - I think that even people who don't know what the Turing test is would say that :P But then, tell me what thought is. And what about consciousness? What do you think is needed before we can use the term "thinking machines"?
If a machine can think (analyse, combine facts, etc.) about itself and the world surrounding it, even has some avatar, and says "I know that I am. It is me. I do things that alter the world, and I regard myself as a unique entity, though I can be copied" - what then? Will you look into its database/code and then say that it is just a simulation? Then what?
And BTW, I wonder how people can convince other people that they are conscious ;)
Instead of greater bit sizes we should work on clock speed and core temperatures. I would like to have a nice 6.0 GHz processor without the need for liquid nitrogen. Plus, 1024-bit machine code would be such a bitch to debug, or even to write an assembler for. I have enough problems attempting to read 32-, 64- and 128-bit instructions.
they have 128 bit programs now?
I thought we were still stuck at 64.
Why do we have to discriminate based on simulation terms? Is it murder to have a simulated entity? No, wait, I know what it is... some people have way too much ego, and are afraid to acknowledge, share, talk about, or even consider the presence of other types of entities.
And what's with the convincing part? It's like you have an obsession with convictions; reminds me of religion.
256-bit? 1024-bit?
256-bit OS would allow you to address up to 115,792,089,237,316,195,423,570,985,008,690,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000 bytes of RAM. (Rounded; That's 115 quattuorvigintillion)
... and a 1024-bit OS would allow up to 179,769,313,486,231,590,772,930,519,078,900,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000 bytes of RAM. (Obviously rounded; That is 179 uncentillion)
I don't see us needing that kind of computing power within our lifetime. Even the next step up to 128-bit would increase potential to 281,474,977,000,000 yottabytes.
64-bit will be with us for a while to come.
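Those figures check out; a quick verification in Python, assuming binary prefixes (YiB = 2^80 bytes):

```python
# Address-space sizes for wider word widths, one address per byte.
YIB = 2 ** 80                 # one yobibyte in bytes

print(2 ** 128 // YIB)        # 281474976710656 YiB for a 128-bit space
print(len(str(2 ** 256)))     # 78 decimal digits for 2^256
print(len(str(2 ** 1024)))    # 309 decimal digits for 2^1024
```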
That's more than a google! The 1024 bits would be for the CPU, not the OS
*googol
No, Microsoft is developing Windows 8, which is supposed to come out July 1st, 2011. There are rumors that there will be a 128-bit version.
>>38
you do realize how long it took for there to be a 64-bit version of windows after 64-bit hardware was widely available, right? i wouldn't expect to see a 128-bit version of windows until at least 2015, if at all.
we have a 2054-bit processor here at DRDO India ;) it's a supercomputer ;)
It never hurts to overdesign. The speed of processors has pretty much hit fundamental physical barriers. Making chips capable of processing 4 simultaneous 128-bit instructions in one CPU cycle would speed up number processing for things that need it, such as any of the BOINC projects. Instead of taking months to process a genome or a climate projection, it could be done much more rapidly. The amount of data currently available on the internet is only going to be the tip of the data iceberg. Take the utilities that make 32-bit code run on 64-bit systems and use that approach to run something that needs 512 bits, with data redundancy. Take your pick: designing a chip that runs at 1 GHz and is 1024 bits wide but consumes virtually no power and runs at room temperature without cooling is much more valuable than designing a 10 GHz chip that needs a factory-sized cooling device. Just my two cents on this one.
With more stars out there than grains of sand, you would think we might eventually need a computer capable of processing them all at once... now that's something to work towards.
There is little reason to add bits to word size. 64 bits give you any reasonably big integer value, any reasonably precise floating point number, and any reasonable address in memory. Adding bits to that has little effect, it just extends the range to the 'less reasonable' area. Yes it can be useful in certain applications but it won't bring effect in most.
What's interesting is having the computer process them faster, or more of them at once, and because increasing the frequency is getting increasingly hard due to the speed-of-light limit, adding CPU cores and massively parallel units such as GPUs is the current way to go.
>>42
What is reasonable in 2010 may not be a few years down the road. And it's always later than you think.
"No one should ever need more than 640k." --Bill Gates, 1981
As we develop artificial intelligence we will need more computing power and RAM to process data. A 1024-bit system is only 15 years away.
A 256-bit system will be essential, computing graphical animation at greater speed rather than writing temporary files and reading the images back separately.
Well, Skynet is supposed to take over April 21, 2011. The war is supposed to rage until 2029 when we destroy Skynet's master control. SO, you scientists and technogeeks need to stop posting random thoughts and GET BACK TO WORK! Be a part of the RESISTANCE, not the problem! ;)
We already have Multiple Instruction Multiple Data processors. I really doubt increasing instruction bit size would help anything.
No need for 128-, 256-, 512-, 1024-bit OSes and compatible CPUs?
Well, the biggest example you have, you are using right now: the Internet. It runs mostly on 32-bit IP addresses, but now we need a bigger address scheme, hence 128-bit IPv6 was introduced. It was said we would never need more IPs than 32 bits could address, and yet here we are. If not for NAT, Internet working would have crashed years ago.
We can't see the need for 128 or even 256 bits right now, but the need is there, just a little way down the road. Get your binoculars and look a little further ahead.
It is true, however, that a 128-bit or higher OS will not bring faster computing, at least not very noticeably. What we need is a completely new computer system, i.e. a new bus capable of transferring data near the speed of the 128-bit CPU, and storage near that speed as well. In other words, the hardware needs to match or approach the speed and performance gains of a 128-bit or higher CPU.
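On that analogy: the successor to 32-bit IPv4 is 128-bit IPv6, and the gap between the two address spaces is easy to see. A small sketch using Python's standard ipaddress module (the example addresses are from the reserved documentation ranges):

```python
import ipaddress

# IPv4: 32-bit addresses, ~4.3 billion total -- exhausted without NAT.
print(2 ** 32)    # 4294967296

# IPv6: 128-bit addresses -- about 3.4e38 of them.
print(2 ** 128)   # 340282366920938463463374607431768211456

# The standard library parses both formats.
print(ipaddress.ip_address("192.0.2.1").version)    # 4
print(ipaddress.ip_address("2001:db8::1").version)  # 6
```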
This is a no-brainer.
We have only 7 billion people on this planet, and a 32-bit OS can only address 4 billion addresses.
We know that we will need a 1024-bit OS and processor eventually,
and not something bigger, as the number of atoms in this universe is contained within 2^1024.
So why not build it now?
We'd skip unpleasant transitions like the 32-to-64-bit one in the future.
>>1
Apparently, before 2008: http://en.wikipedia.org/wiki/Efficeon
...And no one noticed.
a 256-bit processor would be amazing for AES encryption, since AES-256 works with a 256-bit key (the state matrix itself is 128 bits).
"256-bit" means the computer natively works with numbers that are 256 binary digits (bits) wide. Most computers now handle 32-bit numbers, i.e. numbers with 32 binary digits; there are also 64-bit computers, which handle 64 binary digits. For a better explanation, head over to Numberphile on youtube.
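More precisely, an n-bit number has n binary digits and can take 2^n distinct values; a small illustration (not tied to any particular CPU):

```python
# An n-bit unsigned integer has n binary digits and 2**n possible values.
for bits in (8, 32, 64):
    max_value = 2 ** bits - 1
    print(bits, 2 ** bits, format(max_value, "b"))
# e.g. 8 bits -> 256 values, max value 11111111 in binary
```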
Nobody should need more than 640 bits.
>>55
Proof that if you are going to fail, you might as well fail superlatively.
>>60
this thread has been on this board for about 5.5 years. why move it now?
hey, if you get satisfied with this, how can you be called a science lover?
there is no certainty about how much more Homo sapiens will want than they've got, in every field. if they want more - if they want to store their memories from their mind to a hard disk in full HD - where is the problem? real-track deployment videography should be an alternative
>>59
Except an atom is not the smallest unit, so that's as arbitrary as saying "the universe is made out of 2 halves, so 2 bits should be enough for everyone!"
I think PCs will be separated into PCs for citizens and PCs for (technical) workers.
People don't need such high-spec computers for their daily leisure.
Is a thread for this required on this board?
It may be fundamental to mention that the ultimate entropy of the Universe is bound by the area of the de Sitter horizon in Planck units. The radius of the horizon is about 10^60 Planck lengths, so the area is 10^120 Planck areas. The largest entropy that our Universe may carry is therefore about 10^120 bits.
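A rough sketch of where that number comes from, using the Bekenstein-Hawking entropy bound (order-of-magnitude only; constant factors like the 1/4 are dropped at the end):

```latex
S_{\max} \sim \frac{A}{4\,\ell_P^{2}}, \qquad
R_{\mathrm{dS}} \sim 10^{60}\,\ell_P
\;\Rightarrow\;
A \sim R_{\mathrm{dS}}^{2} \sim 10^{120}\,\ell_P^{2}
\;\Rightarrow\;
S_{\max} \sim 10^{120}\ \text{bits}
```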
Wow, look how old this post is! It's almost been a decade and it's still getting bumped!
So how is that prediction going, about the 128-bit PC maybe happening?
>>67
RISC-V looks promising on that front, with support for a 128-bit address space. As well as that, the ISA is free and open, which is always a plus.
I'd be more interested in a 128-qubit quantum computer.
There technically are 512-bit CPUs, but they aren't really exciting... the Intel Xeon Phi - I think it's only the floating-point part that is 512-bit, so just number crunching for science.
>>64
Consumer desktops and laptops are still being sold with 4 or 8 GB of RAM (every so often you see one with 12 GB of RAM these days) and neither Windows nor standard software has gotten bloated enough to render this insufficient. Meanwhile I use a computer with 32 GB of RAM and an 8-core processor in my laboratory for sequence alignment. Moore's law stopped working as advertised in 2005, so I expect that this state of affairs will continue until either a) quantum computing comes along and upsets things or b) technical progress in computers finally comes to a halt and information technology becomes another ossified industry that was once cutting edge, like cars or petrochemicals. The way things are looking now, I think that quantum computing will be developed the year after they figure out controlled nuclear fusion and world peace.
At this point even an 8-qubit (or full quantum byte) computer would be good. From what I heard, the max they have achieved so far is something like 5 qubits or some other odd number.
It doubles about every 10 years
1980 - 8 bit
1990 - 16 bit
2000 - 32 bit
2010 - 64 bit
2020 - 128 bit
2030 - 256 bit
2040 - 512 bit
2050 - 1024 bit
2060 - 2048 bit
2070 - 4096 bit
2080 - 8192 bit
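That schedule is just 8 × 2^((year − 1980) / 10); a tongue-in-cheek sketch of the model (purely an extrapolation, not a prediction):

```python
# Tongue-in-cheek model: mainstream word size doubles every decade,
# starting from 8 bits in 1980.
def word_size(year):
    return 8 * 2 ** ((year - 1980) // 10)

for year in range(1980, 2090, 10):
    print(year, word_size(year))  # 1980 -> 8, ..., 2080 -> 8192
```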
>>74
https://en.wikipedia.org/wiki/Timeline_of_quantum_computing#2016
> Google, using an array of 9 superconducting qubits developed by the Martinis group and UCSB, accurately simulates a hydrogen atom.
and there's this nonsense: http://www.dwavesys.com/press-releases/d-wave-systems-breaks-1000-qubit-quantum-computing-barrier
>>76
and now d-wave has built an even bigger fake quantum computer ("2000 qubits").
That would be lots of data! xddd