http://www.engadget.com/2008/12/19/ibm-claims-title-of-worlds-fastest-graphene-transistor/ It's true, and IBM has done it. They claim this new chip technology can reach 100GHz. So one day in the future, 100GHz or maybe even 1THz might really be a possibility in our home computers.
nein, i'm guessing it'll only bump it up to 10ghz or so. the limit of an individual silicon transistor is about 40GHz, but when you have a billion of them on a chip, you can only push about 4GHz reliably.
We are not talking about silicon transistors here. IBM made a graphene transistor. So it's in a totally different ballpark to the silicon ones.
re-read my post. and your post, too. it's a legit comparison. how on earth do you expect a full processor to scale UP in frequency compared to a single transistor? a complex processor won't be faster than its simplest component, no matter what it's made of. silicon, graphene, vacuum tubes, whatever. if the graphene transistor can hit 100GHz, then don't expect a full graphene processor to hit much more than 10GHz.
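to put rough numbers on that: a chip's clock period has to cover the *critical path* -- the longest chain of gate switches a signal crosses in one cycle -- so the clock can never match a lone transistor. a quick sketch in python, using the 100GHz transistor figure from the article and an assumed logic depth (the 20-gates-per-stage number is my guess, not anything IBM published):

```python
# Critical-path sketch: why a full chip clocks far below one transistor.
transistor_ghz = 100                       # single graphene transistor (claimed)
switch_delay_ps = 1e3 / transistor_ghz     # ~10 ps per switch
gates_per_stage = 20                       # assumed logic depth of one pipeline stage

stage_delay_ps = gates_per_stage * switch_delay_ps   # 200 ps to traverse the stage
chip_ghz = 1e3 / stage_delay_ps                      # clock must wait for the whole chain

print(f"{chip_ghz:.0f} GHz")  # -> 5 GHz
```

same math with silicon: 40GHz transistors and the same depth land you in the ~2-4GHz range we actually see today.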
You make one assumption: that you know the future and how it will unfold. You do not, and I do not. Yes, what you said is a good guess based on past events, but we will wait and see if your prediction is correct. Only time can tell. And what would it be used for? At the moment nothing would utilise 100% of its potential, but in the future something might use up all its processing power.
I highly doubt it couldn't be fully utilised today. There's that network where private computers help process NASA data; I can't remember the name of it. That, for example. And then there's hacking, where all the possibilities need to be tried. So yes, a 10GHz computer would fit in more than well today. You just have to think outside gaming.
I was thinking more along the lines of realtime video rendering and 3D image creation for movies and science. And to me that's dependent as much on the GPU as the CPU. Sure, a 10GHz computer would be easily used. As you said: science, or hackers, or maybe Pixar's latest movie, or the next-gen Blizzard game. But something like a 100GHz anything, I dunno.
People said things like that 12 years ago once they'd set themselves up with a 200MHz Pentium. In 12 years' time the thing you're using now will be called all sorts of bad names. People's judgements are all relative to whatever is cutting edge.

There are a lot of computational problems that could benefit from a 100GHz processor. For example, in computer graphics: radiosity illumination could be achieved in real time. That kind of lighting model is more advanced than the Phong-based models used on graphics cards at the moment.

Also, physics. Modelling the physics of arbitrary three-dimensional objects is much harder than a load of spherical balls, and it takes a lot more CPU time. And consider liquids. Think of objects splashing in water, with any overspill running downhill and soaking into the ground. Imagine the impact creating ripples on the surface, and things floating and bobbing around with the disturbance. Things achieved with simulated physics, not just faked.

Or consider a flight simulator. How about a proper simulation of the air surrounding the craft, with fluid mechanics? How about a game where you could design or modify your own aircraft and see how they fly and manoeuvre, using real-world physics?

AI could be improved for computer players too. You know how powerful those chess computers are? Unlike the AI in computer games, they actually think about how the game could evolve, and act to minimise the risk of failure. Even in online multiplayer gaming, the computer-controlled monsters your group takes on could be made to appear intelligent, capable of defeating you through clever moves that give the impression of a hardcore player with years of experience -- something far more worthy of respect than the high hitpoints and damage used today.
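To put some rough numbers on the liquid example: here's a back-of-envelope estimate of the compute a real-time grid-based water simulation would need. All the figures below are my own illustrative assumptions (grid size, cost per cell, frame rate), not anything from this thread:

```python
# Back-of-envelope cost of one real-time fluid simulation loop.
grid = 256 ** 3          # cells in an assumed 256^3 simulation volume
flops_per_cell = 500     # rough assumed cost of one solver update per cell
fps = 60                 # real-time target frame rate

flops_per_second = grid * flops_per_cell * fps
print(f"{flops_per_second / 1e12:.1f} TFLOP/s needed")  # -> 0.5 TFLOP/s needed
```

Half a teraflop, sustained, just for the water -- before rendering, AI, or anything else -- which is why today's games fake it instead of simulating it.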
it's not an assumption. you don't need to predict the future to know that a large complex system cannot operate faster than its simplest components. this goes for everything from computing to biology. you can't have streetlights changing in less time than it takes a single car to cross the street. you can't make a plane take off in less time than it takes a passenger to walk to their seat. the brain can't fire signals at a higher frequency than an individual neuron can handle. a 100GHz transistor will NEVER give rise to a processor that operates at 100GHz. i'm not saying 100GHz processors will never exist, but if they do, they will probably require 1THz transistor technology to build them.

as for uses, i can imagine TONS. ever watch sci-fi?

- NATURAL language processing. i have no idea how much processing power would be required to understand tones, inflection, slang, context, and accents, all in real time, but it's certainly much more than what we have now. probably the next biggest revolution in interfaces since the GUI.
- intent processing. imagine something like this: http://weburbanist.com/2009/10/17/amazing-program-turns-sketches-into-photo-montages/ except it works for just about everything. give your computer an image, tell it "except i want to see it in red", and it will understand and edit it perfectly. also, maybe try that with video.
- home processing servers. many homes these days already have more than one computer. the lag within a single house would be pretty small, so even if a single person won't need all that processing power, one processor could easily serve a full family (as well as automating some house functions), needing only cheap, small terminals for each person.
Ha, games started all that upgrade madness. Games will continue that upgrade madness. :yes: The game developers ain't done yet!