@Raype: Again, I'm not looking at this in black and white. Of course there are good games this gen, and there were bad games in Gen5, which is why I said I need to take time and find the innovative games from each gen.
On the AI, though: as someone who used to mod Half-Life, playtested the AI a lot, and saw the code, I can say that Half-Life could do all of those things. In fact, it could do them so well that you had to dumb the AI down when porting to consoles, so that people playing with those slow analog sticks would stand a chance. Also, both flanking and covering are mostly scripted (you have to tell the AI which way counts as "flanking", etc.). Those "run at you shooting" AIs only belong to really bad games whose developers didn't bother writing an AI at all. We've had them since Gen5 (or before?), and there are still plenty of that kind in Gen7 games.
BUT I'm not talking about shooter AIs. Those are actually reverse AIs: with simple code you can easily write an AI so good that it kills the player from miles away. Most of the code there exists to dumb it down so the fight seems "fair" to the player.
I'm talking about realistic personality AIs. Sort of an improvement on the ones from STALKER (Gen6, which had AI that would travel around the world, band together or split up with other NPCs, and look for food and shelter in real time). Was it impossible? No. Was it gimmicky? No. Yet most games still have AI that just stands somewhere (or patrols, if it's really smart) until the player bumps into it.
Think about how advancements in AI could help games. Think of people in an RPG world whom you can talk to directly and who understand what you mean. We have this on cellphones; it is possible. (I have a somewhat fair understanding of how it would work.) But it hasn't happened. Instead, we're stuck with the dumbed-down (but more heavily scripted) AI of Gen5.
It's not just AI though. There are a lot of possibilities that developers just don't tackle.
But maybe it's my ability to program that has lifted my expectations higher than in the old days?
@Raype: The Wii does not have an HDD. The original Xbox had an HDD, the PS2 could be upgraded with one, and PCs have had HDDs since the olden days.
I would bring up the PC, but didn't the Dreamcast have something similar? I mean, it didn't have dashboard chat or whatever, but you could still do the same things. This is more gimmicky than anything else we've mentioned so far! (But I probably feel this way because I can't do any of it with my shitty internet speed.) (But Steam.)
Again, I don't see any point in trying to separate PCs and consoles. They're different, yes, but not as different as you make them out to be. In my view, consoles and PCs affect each other, so you can't just say they're entirely different markets. A big part of my argument comes from the PC side.
I do own a Wii. I bought it for some exclusives and GC compatibility. (GC was never released in my country)
Slow analog sticks? Console Half Life games can use KB+M. And the AI was certainly intact in the PS2 version.
There's a difference between "coded" and "scripted". Scripted is like a cutscene, but in-game (like Half-Life's cutscenes). Coded means it's dynamic. If I load up Uncharted 3 (i.e. that game from Roxas's sig), the AI will pick different cover, flank in different patterns, and utilize different tactics each time I play, whether I do anything differently or not. Sometimes they might lob a grenade, sometimes they might blindfire forever, sometimes they'll take cover a lot closer to you. It's certainly more dynamic than Half-Life (which largely limited the AI to the human enemies anyway), and definitely more complicated than Splinter Cell's AI (which was script-heavy and not that bright when the actual AI routine was running). This also doesn't get into things like damage zones and such. UC3 doesn't really do it, but in certain games taking out the legs causes enemies to develop a permanent hobble, while knocking out an arm affects their aim. Half-Life's AI didn't really respond to being shot in different areas until the sequel, and even that was mostly canned AI.
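To be clear about what I mean by damage zones feeding back into behaviour, here's a minimal sketch. Everything in it is made up for illustration; it's not from UC3, Half-Life, or any shipping engine.

```cpp
// Toy sketch of locational damage changing AI behaviour (illustrative only;
// all names are invented, not taken from any real engine).
#include <cstdio>

enum class HitZone { Head, Torso, Arm, Leg };

struct EnemyAI {
    float moveSpeed = 1.0f;   // fraction of normal speed
    float aimError  = 0.0f;   // extra aim spread in degrees

    void OnHit(HitZone zone) {
        switch (zone) {
            case HitZone::Leg:
                moveSpeed *= 0.5f;   // permanent hobble
                break;
            case HitZone::Arm:
                aimError += 5.0f;    // a wounded arm shoots worse
                break;
            default:
                break;               // head/torso left to the health system
        }
    }
};

int main() {
    EnemyAI grunt;
    grunt.OnHit(HitZone::Leg);
    grunt.OnHit(HitZone::Arm);
    std::printf("speed x%.2f, aim error +%.1f deg\n",
                grunt.moveSpeed, grunt.aimError);
}
```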
Design choice. If AIs wandered all the time, finding them might be annoying. It'd really only be functional in an open-world game. Bethesda kinda-sorta did a tiny version of that, so eventually it can happen. Again, the PC is usually out ahead when it comes to this stuff. You can't measure consoles by the same ruler. Baby steps.
If you have a fair understanding of how it works, you should understand why it's impossible, and also why that digital assistant isn't even an AI. Ignoring that, it's an absolute coding nightmare for something that's really kind of dumb and would lead to a world populated by Microsoft Sam. And that's assuming it even works right. Lifeline (PS2) and the couple dozen games with Kinect integration showcase just how shoddy that tech is right now at handling anything more complicated than Siri. We're decades from this being remotely viable in a major game.
That's something of a software issue, and just because no one is doing it now doesn't mean they won't later. Things take time to become doable, and even longer to become cost-efficient and highly functional. The last thing the world needs is half-hearted "innovation" at the complete expense of "fun" and "accessibility". It doesn't matter how innovative your game is: if it's not fun, is glitchy as hell, and requires a PhD and 10 years of training to play, no one is going to want to touch it.
It's largely standard now, though. The PS2 HDD was pretty much just there for FFXI. The original Xbox had an HDD, but it was a latecomer and didn't use it in quite the same capacity as this gen did right off the bat. It was definitely the innovator and the cause of everything that came afterwards, but I don't think anyone is itching to go back to the Xbox after using the 360, nor would anyone in their right mind want to deal with the PS2's online after having the PS3 (in that case it's a huge, huge downgrade).
The Dreamcast thing was barely there. There were SIX online games for the DC, total. In contrast, six online-capable games came out this week alone for PS3/360. The online was, again, another gimmick: it was there, it helped sell consoles, and people got all excited for it. And then nothing happened. But now it's a legitimate part of the gaming experience, and it's really changing shit up.
Then you should realize how silly it is to use that argument, seeing as consoles have, up until super recently, been PC gaming for poor people who don't have high-end PCs. It was the entry-level gaming format. They're a similar market today, but even then you have a lot of bells and whistles (the added AI stuff being one of them) that consoles simply aren't capable of because they just don't have the power/infrastructure to handle it. Keep in mind the current consoles are using roughly the specs of a mid-range PC from '04. We might as well bring iOS and Android into it at that point, and lament how Angry Birds doesn't have a day/night cycle even though Majora's Mask did that shit.
In my book, coded means coded, as in coded in C++ or some other language; that's handled by an AI developer. Scripted means scripted, as in written in a game-specific scripting language; that's done by the level designer. Half-Life had minimal scripts (just waypoints for elevation changes); the rest was handled by code. Today's AIs need to be specifically told by the level designer where to take cover, which way to flank, etc., so they're scripted. What you are describing is not better AI code or less scripted AI, it's just more random AI. I could play with the constants in Half-Life a bit and achieve the same results.
I think it's quite possible. There are things such as Wolfram Alpha and even Google that do this. (It doesn't specifically need to understand spoken language; rather, you can type anything you want and it should be able to answer with something relevant. That's good AI.) It's not impossible, it just hasn't been done before, because the money that could be spent on developing it has been spent on advertising and marketing instead. (There's a guy who has written an AI that develops games; nothing is impossible.)
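For what it's worth, here's a toy sketch of the typed-input idea, assuming nothing smarter than keyword matching (every name in it is invented). Real language understanding is obviously a much bigger problem than this, but it shows the minimum version of "type anything, get something relevant back":

```cpp
// Toy typed-input NPC "understanding" via keyword matching.
// Purely illustrative; a real system would need far more than this.
#include <cstdio>
#include <string>
#include <vector>

struct Intent {
    std::vector<std::string> keywords;  // words that trigger this intent
    std::string reply;                  // what the NPC says back
};

std::string Respond(const std::string& input, const std::vector<Intent>& intents) {
    for (const auto& intent : intents)
        for (const auto& kw : intent.keywords)
            if (input.find(kw) != std::string::npos)
                return intent.reply;
    return "Sorry, I don't follow.";
}

int main() {
    std::vector<Intent> intents = {
        {{"sword", "weapon"}, "The blacksmith is across the square."},
        {{"inn", "sleep", "rest"}, "The inn is just past the gate."},
    };
    std::printf("%s\n", Respond("where can i buy a sword", intents).c_str());
}
```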
I agree with your last paragraph.
That's not what it means when it comes to game design, though. If the enemies are told "go here, do this", that's script. If they're told "FIND COVER AND KILL THIS GUY" and do the rest themselves, that's code. And no, today's AI doesn't have to be told that. At all. It hasn't been necessary for a while, and with the class setup of modern, elegant devkits it's easy as hell to pull off, which is why it's not that uncommon. If you have to manually tell every character what to do, you're making way more work for yourself than you have to as a programmer. Calling it "random AI" is an absolute misnomer and not even how that works. It runs an AI routine that relies on random-number-based class selection combined with analysis of the battlefield and some other such nonsense, which yields an enemy that gives the illusion of thought because it's intelligently unpredictable and reacts to parameters in unique and different ways each time, as opposed to "run right" or "walk towards player". Which is the whole basis of artificial intelligence. And yes, a more random AI is, by its very nature, a better AI, because intelligence demands randomness. If you can predict the outcome without forcing it, it's not an AI, technically speaking.
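To make the distinction concrete, here's a minimal sketch of the kind of routine I'm describing: random selection among tactic options, weighted by a quick read of the battlefield. All the names, situations, and weights are made up for illustration, not pulled from any actual game.

```cpp
// Toy tactic picker: weighted random choice driven by simple battlefield
// analysis. Purely illustrative; not taken from any real engine.
#include <cstdio>
#include <random>
#include <utility>
#include <vector>

enum class Tactic { TakeCover, Flank, ThrowGrenade, BlindFire };

struct Situation {
    float distanceToPlayer;  // metres
    bool  playerInCover;
    int   nearbyAllies;
};

Tactic ChooseTactic(const Situation& s, std::mt19937& rng) {
    // Base weights, nudged by what the enemy can "see" of the battlefield.
    std::vector<std::pair<Tactic, float>> options = {
        {Tactic::TakeCover,    1.0f},
        {Tactic::Flank,        s.nearbyAllies > 0 ? 1.5f : 0.5f},
        {Tactic::ThrowGrenade, s.playerInCover   ? 2.0f : 0.2f},
        {Tactic::BlindFire,    s.distanceToPlayer < 10.0f ? 1.0f : 0.3f},
    };

    std::vector<float> weights;
    for (const auto& o : options) weights.push_back(o.second);
    std::discrete_distribution<int> pick(weights.begin(), weights.end());
    return options[pick(rng)].first;
}

int main() {
    std::mt19937 rng(std::random_device{}());
    Situation s{8.0f, true, 2};
    for (int i = 0; i < 5; ++i)  // same situation, varying choices
        std::printf("tactic %d\n", static_cast<int>(ChooseTactic(s, rng)));
}
```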
Mind if I ask how far along you are on your programming degree?
You're redefining "script" and "code". Code is code and script is script; it's pretty simple. What's written by AI coders is AI code; what's written by level designers is AI script.
Now, a level designer can specifically tell a unit to "go here, do this". But he can also tell it where to find cover; as a matter of fact, he has to tell it where to find cover. The placement of cover points is none of the AI coder's business; it's the level designer's responsibility.
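Roughly what I mean, as a toy sketch (all names invented, not real engine code): the designer places the cover markers in the level data, and the AI code only gets to choose among them.

```cpp
// Toy illustration of the designer/coder split being argued here.
// Cover points come from level data placed by the designer;
// the AI code just picks among them. All names are invented.
#include <cmath>
#include <cstdio>
#include <vector>

struct Vec2 { float x, y; };

// Placed by the level designer in the map editor.
std::vector<Vec2> g_coverPoints = {{2, 3}, {10, 1}, {7, 8}};

// Written by the AI coder: pick the nearest cover point to the unit.
const Vec2* PickCover(const Vec2& unitPos) {
    const Vec2* best = nullptr;
    float bestDist = 1e9f;
    for (const auto& p : g_coverPoints) {
        float d = std::hypot(p.x - unitPos.x, p.y - unitPos.y);
        if (d < bestDist) { bestDist = d; best = &p; }
    }
    return best;
}

int main() {
    Vec2 unit{6, 6};
    if (const Vec2* c = PickCover(unit))
        std::printf("take cover at (%.0f, %.0f)\n", c->x, c->y);
}
```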
What programming degree?
I have been studying computer engineering for a while, but I'm thinking of dropping out and studying game design in the UK. I've already been accepted at a university there, but they don't give visas to people from my country easily.
They don't teach you much in class, though. Most of my "knowledge" (if you can call it that; I'm at a very beginner level compared to most programmers, but I'm trying to get better) comes from modding HL1 and reading articles.
I don't understand what any of that has to do with innovation and technology. The innovations in Formula 1 are very different from the innovations in a city car, but both have innovations, and they usually borrow from each other. Performance in a certain category doesn't have anything to do with innovation, IMO.
There is room for development, improvement, and innovation in your car, in a Formula 1 race car, or in a bicycle.
Yes, but if the AI "code" is taken completely out of the equation, then it's a scripted event. Which is what a lot of people did back in the day to make it *look* like there was an advanced AI under the hood when there wasn't: it was just a really dumb enemy designed to do something unusual in that one specific instance, reacting to a player-created flag and playing out a little cutscene that'll be identical no matter how many billion times you see it. In short, a script. Nowadays we see more coded AI that actually depends on things occurring in the game. When one of these AIs sees a player, it reacts in numerous different ways. Hence, better AI and better-coded AI, as opposed to illusory scripted AI that's outright told to act a specific way when the player shows up. Or, in other words, it's the reason goombas walk left all the time, but advanced enemies in modern Mario games might decide to run right or throw out an actual attack depending on numerous factors, some of which are totally random.
Or, to put it another way, telling a lot of units exactly what to do is terrible level design and leads to a shitty product; players will often simply exploit the AI (look at any poorly designed fighting game for spectacular examples of this). But telling the AI what its options are so it can make varying choices (which may depend on such factors as where the player approaches from, what weapon he's using, and the sort of character he's running), while throwing in some RNG to make the game different every time the player loads up, is brilliant game design and the sort of thing you're more likely to see nowadays. If SNK bosses were as easy to predict as Shao Kahn, nobody would be raging endlessly about how Geese is a big boiling vat of dicks. They'd be huddled in the corner exploiting AI loops, predictable setups, and bugs for easy victory. Instead the AI reads you and does all of that shit to you instead (which is total cheating and kind of not how this is optimally supposed to work, but it works well enough for this illustrative example).
Then I should point out that the AIs you're pointing to are nothing more than toys, and the stuff you're expecting of video game AI is decades off, not to mention it would take someone years (and millions, if not billions, of dollars) to implement even in a terrible fashion. I can pop a nice French pastry out of my toaster; it doesn't make my toaster a French chef. Just because someone made a half-assed toy that runs some simulations doesn't mean it actually designs games. It's incredibly far from it, in the same way that the moon landing is a far cry from colonizing Mars. Hell, I think that might be generous. This game-designing AI is more accurately a couple of generations removed from von Braun even being born.
Wolfram isn't even technically an AI. It's just a gimmicky pseudo-AI thrown on top of a search engine so people can attention-whore it out as the next big thing. It's a giant remote control for an existing function: it can't interpret or analyze, and it can easily fuck up royally. There's a reason Microsoft didn't showcase the XBone's voice commands onstage (it was a slideshow): because it doesn't work. Oh sure, it'll probably work much of the time and all that. Unless you have an accent. Or a speech issue. Or are in a noisy room. Or happen to be slightly too far from the mic. Or happen to speak too fast or too slow for its liking. Or you use a different voice command than the one hard-coded into it. Or Saturn isn't properly aligned with Neptune. Or any of the billion things that happen to get in the way of this stuff.
And therein lies the problem: it'll work just fine for certain people, but everyone else gets fucked. I'm a moderately accented Canadian with an off-tone voice and a tendency to speak fast; voice recognition software hates me with an unbridled passion. God help anyone from Jamaica or Portugal. Hell, the Kinect had serious trouble even picking up black people on its camera; we can't do object recognition without serious glitches getting in the way, and command-based voice recognition (the only kind even remotely viable right now) doesn't work right all the time either. Full-on voice recognition and interpretation at a functional capacity is so distant we might not see it within our lifetime. The closest we might get is a lot of codespeak phrases ("XBOX ON" and the like), and that's just replacing your controller with voice commands, which is already being done with Kinect (Mass Effect 3 had it, as I recall). You're just using your voice instead of moving your thumb a little. It's the Wii-mote thing all over again.