
Thread: Enter the Age of 64-bit Processing

  1. #1
    Join Date
    Jun 2001
    Location
Laurelindórenan
    Posts
    4,382
    Thanks
    0
    Thanked 0 Times in 0 Posts

    Default Enter the Age of 64-bit Processing

    How does everyone feel about the upcoming 64-bit processor war that should begin late next year? Thoughts and opinions only please. I don't want to see any replies that read something like "Itanium rules!" or "AMD is the s**t!".
    <img src="http://anime.emuparadise.org/sig.jpg" border="0">

  2. #2
    Join Date
    Jun 2001
    Location
    Split, Croatia
    Posts
    533
    Thanks
    0
    Thanked 0 Times in 0 Posts

    Default

As J.C. Dvorak said, "we don't need super-terahertz processors to run Microsoft Word." Until applications adapt to 64-bit technology, it will be useless to get one of these processors, unless you plan to build a high-end server on a 64-bit multiprocessing platform.

Microsoft, as the leading producer of operating systems, needed a whole six years to fully adapt to 32-bit technology. Until XP was released, Microsoft's all-in-one operating systems (95, 98 and ME; NT4 and 2000 were specialized OSes) lagged in speed and reliability because they relied on the old 16-bit core. Of course, the operating systems of that age had greater possibilities since they had partial 32-bit support. That will be the case with 64-bit operating systems: 75% 32-bit, 25% 64-bit; faster, more efficient and capable, but not worth the investment yet.

There are full 64-bit systems like SCO Unix, HP-UX and Solaris. Try running your favorite game or tweaking application on them, if you have the guts.

On the hardware side, I'll give my full backing to Intel (as always), because they have been in the 64-bit market for some time now. Their first 64-bit processor for desktops and workstations was the Itanium, but they have produced specialized 64-bit processors since 1988, which used RISC instruction sets instead of the standard x86 set. AMD is new to this one.

Why go full 64-bit at all? Pentium 4s already have several 64-bit instructions that really smooth out the hard work if the apps know how to use them. The latest releases of 3D Studio Max, Bryce, Adobe Photoshop and the new Visual Studio .NET support them and run about 30% faster with that support inside. I think the next logical step would be widening the range of 64-bit instructions inside the 32-bit core.

And the war... as always with breakthrough new technologies, it will cost. The thing is, upgrading to a new OS is cheap enough that you can test it and see if it suits your needs, but getting a whole new 64-bit system just to see how it works would be very expensive. I'll just wait...
    Intel is up, AMD eats shit

  3. #3
    Join Date
    Jun 2001
    Location
Laurelindórenan
    Posts
    4,382
    Thanks
    0
    Thanked 0 Times in 0 Posts

    Default

Isn't x86 poorly designed? I heard that Intel wanted to make up for it with their new 64-bit processor, though I don't know the details. I do know that Intel has been working on the new one since 1993, and they announced then that it would take several years. In fact, it took a decade, and now it is almost here. I don't plan on converting to 64-bit any time soon, or maybe at all. I don't think it will be much of a "war" until months after we see the hype about AMD's and Intel's technology. So it will probably be 2004 before it hits the mainstream and Steven, the Dell guy, starts saying, "Dude, you're gettin' a 64-bit computing solution...". Just kidding.

  4. #4
    Join Date
    Jun 2001
    Location
    Split, Croatia
    Posts
    533
    Thanks
    0
    Thanked 0 Times in 0 Posts

    Default

Magus, all of Intel's CPUs from the 8086 to the Itanium 2 and Xeon MP have an x86 core. The x86 core is huge compared to RISC, but RISC is much faster at simple operations (think cellphone and simpler PDA processors). RISC stands for "reduced instruction set computer," and it works completely differently from x86. If Intel hadn't stopped naming processors by core and generation, like the 2(86), 3(86), 4(86) (they stopped when the first Pentium arrived, because a plain number couldn't be trademarked), the Pentium 4 would be the 886 processor.

  5. #5
    Join Date
    Jun 2001
    Location
    Split, Croatia
    Posts
    533
    Thanks
    0
    Thanked 0 Times in 0 Posts

    Default

And about the second thing: Intel and Hewlett-Packard started working together on the Itanium processor in 1993-94. Intel was to make the CPU and chipset, while HP made a specialized Unix-kernel-based OS for it, HP-UX. The Itanium has proven to be very fast and reliable with 64-bit instructions, but the market didn't accept it because of its crappy compatibility with old 32-bit instructions (ironic, ain't it...).

  6. #6
    Join Date
    Mar 2001
    Location
    India
    Posts
    7,497
    Thanks
    32
    Thanked 304 Times in 122 Posts
    EP Points
    885

    Default

I'm still pretty impressed by the Itanium's performance. And we DO have 64-bit WinXP, just not the right apps to utilize the power in it.
Even then, 64-bit means a LOT more memory bandwidth, and well, that means speed, and of course, we all know the need for speed.
But personally, I still feel that the home user doesn't need 64-bit. Come on, that's overkill for a guy sitting at his desktop chatting on AIM; he knows nothing of 64-bit and never will. And it's true that businesses don't need 64-bit (for normal word processing etc.). Only high-end workstations and servers that earlier used Xeon or Solaris will get something new and great out of it. Make 64-bit the processor standard? I don't think that's a very wise decision. With high manufacturing costs and not much increase in productivity, I wouldn't be ready to buy a 64-bit processor even in 2006. And by then, we'd probably have 128-bit.
So those are my two cents on the subject! And RISC rules! Come on, our video games use 'em, and we love video games.

  7. #7
    Join Date
    Mar 2002
    Posts
    407
    Thanks
    0
    Thanked 0 Times in 0 Posts

    Talking 2 people conversation... :)

Gan, again you come up with your MS Word phrase; that's OK, nothing wrong with it. BUT, hell, the Model T Ford does a nice job of getting people around too, yet you don't see anyone using one anymore, at least no non-collectors.

As far as regular home users not needing 64-bit: well, people really don't need a GeForce4 Ti 4600 either, but they still get them because it's the latest and greatest. Besides, the only people who are going to be getting these chips at first will be people like me, Gan, and you, masj. And don't tell me either of you isn't going to get one, because I'll slap you until you stop lying to yourselves.

As far as x86 goes, it's still going to be the leader in processor design because of the compatibility it has with all of today's hardware. I don't know how we are going to get rid of the x86 instructions in future CPUs, but you're right, they need to go. The way things are moving in the computer business, I'm amazed x86 is still here. I wish we had 1024-bit CPUs. Why? Just to have that kind of processing power under the hood of my ATX tower.


    Graphics, and 3D polygons don't make great games, story-lines do!!



    Grantu2

  8. #8
    Join Date
    Mar 2001
    Location
    India
    Posts
    7,497
    Thanks
    32
    Thanked 304 Times in 122 Posts
    EP Points
    885

    Default

I understood what you meant by your example, but it's not quite in sync with the need for 64-bit processing. What I mean is that people get GeForce4s because there ARE games that THEY WANT TO PLAY which work better with it and look better too.
But there aren't any apps yet which need or utilize 64-bit processing.

  9. #9
    Join Date
    Jun 2001
    Location
Laurelindórenan
    Posts
    4,382
    Thanks
    0
    Thanked 0 Times in 0 Posts

    Default

Like I said, I seriously doubt it will become the standard over the next few years, but 64-bit, though not new, will become part of even a novice user's speech after a while, and it could become the standard. It depends on how the processors are marketed, I suppose. Thanks for correcting me, Ganon.

  10. #10
    Join Date
    Mar 2002
    Posts
    407
    Thanks
    0
    Thanked 0 Times in 0 Posts

Default My 2 cents in again.

I was just watching "Eye Drops" on TechTV and figured out who would really need this kind of processing power: computer animators. Remember, it cost Sony 300+ million dollars to make the FF movie; with 64-bit processors, instead of having thousands of computers render all those frames, they could have used only a couple hundred. This would make CG in movies cost less and look better. Home animators could benefit from a 64-bit processor too. Imagine games where the characters look like the people in the Final Fantasy movie, because our processors could (kind of) handle it. We could have true real-time voice recognition, and true real-time picture/video recognition. So, taking back what I said about only geeks like me getting 64-bit first: I think all gamers would like game characters that look like the cutscene videos in all those FF games. OMG, that would be so tight.

  11. #11
    Join Date
    Jun 2001
    Location
Laurelindórenan
    Posts
    4,382
    Thanks
    0
    Thanked 0 Times in 0 Posts

    Default

Yeah, games are moving towards a photorealistic look; they will probably be there in five years or sooner. Just look at Doom 3. That game is already making leaps and bounds, and we still do not know much about it. I do know that it will revolutionize game engines, just as the original Doom did. The renderer should be next-gen too. This is also a bad thing: if games become too realistic-looking, the industry will come under fire. I can see the ESRB battling with developers now because the games will look like a real movie or something. Imagine a 9-year-old getting hold of a photorealistic action game like Max Payne. Not only would it scare the shit out of him, but it might leave damaging psychological effects. I still would like to see it, though, grantu.

By the way, you get TechTV? A city near me does, but we do not. I really want that channel, damnit.

  12. #12
    Join Date
    Mar 2002
    Posts
    407
    Thanks
    0
    Thanked 0 Times in 0 Posts

    Talking :)

Yup, I get TechTV. Everyone with satellite gets it; I have a DirecTV system.

I can only imagine photorealistic games. But you're right, the public would have fits over photorealistic violent games. Shoot, they complain now about "reality" games like the Tom Clancy/Red Storm titles; they would have a field day with photorealistic ones. I would really love actual virtual-reality headsets and body gear for photorealistic games, though, but knowing me, I would probably use that gear the same way the tech guy in "Timecop" with Van Damme used his. If you've seen the movie, you know what I mean.

Sorry for getting you confused with masj, though; it's that "ma" both of you have in front of your names that makes me mix you two up.

  13. #13
    Join Date
    Mar 2001
    Location
    India
    Posts
    7,497
    Thanks
    32
    Thanked 304 Times in 122 Posts
    EP Points
    885

    Default

Well, as far as 64-bit processing goes, nVidia's NV30 is supposed to be able to render the Final Fantasy movie in real time. Chew on that!
Also, processing power like 64-bit is required for things like real-time speech recognition (since it can process more frequencies of your voice at once) and movie editing. Other than that, I can think of little application for it even now; those are high-end uses...
We still use the analog (film) format, not rendering, to make our movies. I don't see why we would soon need to transcend to digitally rendered movies in real time, except maybe for the possibility of 3D TV!

  14. #14
    Zerostrike Guest

    Default

Quote: "Remember it cost sony 300+million dollars to make the ff movie"
Square Pictures made the movie. I think the cost you listed is wrong too, although I'm not sure.

  15. #15
    Join Date
    Jun 2001
    Location
Laurelindórenan
    Posts
    4,382
    Thanks
    0
    Thanked 0 Times in 0 Posts

    Default

Another interesting idea for virtual gaming: instead of a headset, you could have a single curved monitor that completely surrounds your head at 360 degrees! Matrox's Parhelia has the triple-head display, which is very cool, but imagine three more monitors all around you, with all six blending together.

Oh, and I knew that the NV30 could render Final Fantasy. Very cool. I want to see exactly what nVidia is planning before I jump on the R300 bandwagon. Who knows, nVidia could be king of the $100-$300 video card range again by the end of the year.

About Us

We are the oldest retro gaming forum on the internet. The goal of our community is the complete preservation of all retro video games. Started in 2001 as EmuParadise Forums, our community has grown over the past 18 years into one of the biggest gaming platforms on the internet.
