no subject
Date: 2012-11-12 11:34 pm (UTC)
Sounds like you're mainly complaining about RPGs telling a story (singular), when you want to tell your own or at least have a choice of stories. And I don't think any computer games are going to solve this, at least not as games; role-playing chatrooms already exist. And I think the overlap between people interested in actual role-playing and those interested in the game mechanics (and winning) is shrinking rather than growing.
Most especially, however, do not present me with a sequence of events that PREVENTS me from taking an action that I have previously taken and know I CAN take, simply to allow The Plot to continue.
Yeah, deaths of named characters are permanent or not, depending on what the plot wants (and in about half of the former cases the plot will outright lie to you). This is one of the many reasons I try to ignore the story in games I play.
So really, am I asking so much? All I want is:
New games at reasonable ($40) prices
All major content included
At least 1 hour of gameplay (following the main storyline, not counting all sidequests) for each dollar of price, preferably 2 hours
I can give you links to lots and lots of free RPGs (there are multiple websites just for RPGMaker games), but I'm not sure if you'd like any. Of course none will give you the freedom to tell your own story, and they won't have fancy graphics or music.
Lots of control over game options:
a. Ability to vary overall game difficulty (most games do have this)
b. Ability to skip past major challenges that the player finds too arduous
c. Ability to vary “focus” of the game to player preference (hack-and-slash, epic dramatics, romance, etc.)
d. Possibly variation of mechanics (turn-based VS real-time combat, etc.)
I have seen the second item implemented in some games: a "story mode" that turns off random encounters and either skips boss battles or lets you edit your char's stats right before them. The third and fourth items would greatly increase production costs (and probably take 2 or 3 times as long to make and test the game), just to add things that most players wouldn't ever see/want (different things for different players). There was a comment I saw recently saying that the biggest problem with modern commercial software is that the creators are trying to attract all potential customers at the same time, instead of giving a subset of them what they actually want/need.
no subject
Date: 2012-11-13 04:46 am (UTC)
Not quite. I have no problem with them telling their story. I do have a problem with them telling a story in such a way that it's obvious that I'm actually not playing the character in the story. If there are logical choices for a character to make, they should be covered by the game.
no subject
Date: 2012-11-13 01:10 am (UTC)
I think you're unlikely to see "ability to vary focus" and "ability to vary mechanics" in any really extensive sense... some games have a little of this already (Fallout 3 lets you fight in real-time or pause and queue up actions, to some extent; Mass Effect 3 gives you some control over whether you want lots of story and easy action or hard action and not much story, or something in between) but to do it fully pretty much requires that the developers write several games and put them in the same box. That would cost money and time, and most game development is short on at least one of those things.
Getting the kind of story flexibility you want is hard too... not impossible, by any means, but there are at least two challenges: one technical, one artistic. Technically, you would need a pretty sophisticated story engine to handle all the possible tracks and variations of the possible plotlines, while also tracking how every NPC is going to interact with you. I am distantly acquainted with some people who work on this kind of thing, and while it can certainly be done, it is time and money taken away from shiny graphics or crisp action.
The artistic challenge is that it's hard to write that kind of widely-branching story and all the NPC dialogue. Again, not impossible, but hard.
And from what I've seen, game developers are skeptical that there's enough of a demand for this kind of thing to warrant investing in it... and as much as I, too, would like to see it (heck, I'd like to work on it) I fear that they're right.
Another thing I've noticed about CRPGs when they come up in discussion is that a significant number of people are completists -- they want to be able to know that they've seen all of the game's content. If not in a single play-through, at least in a reasonable number. If the game is too wide-open and multivariate, they're going to be frustrated.
Where I've settled, in my own mind, is that CRPGs tell stories that are, in some ways, different from other stories, and that's just the nature of the beast. "Hurry urgently to this location, there's no time to lose!" in a CRPG means "First, go everywhere you can get to that isn't that location, looking for loot and side quests," in the same way that "Here's your brand-new spy car, Mr. Bond" means "You are pretty much guaranteed to be wrecking this in about forty-five minutes"; and if that bugs you enough to bump you out of the story, you need to be engaging a different kind of story.
Or at least so I tell myself, while sighing over the foibles of the latest game.
no subject
Date: 2012-11-13 09:27 pm (UTC)
No sane development team would even try. Instead they'd try something along the lines of a natural language processor, a la ELIZA, combined with an adaptive neural network to manage everything. Except that they wouldn't do that, either. Playing against neural networks isn't fun.
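For reference, an ELIZA-style processor is little more than an ordered list of pattern/template rules tried in sequence; a toy Python sketch (the rules themselves are invented for illustration):

```python
import re

# Ordered (pattern, response template) rules, tried top to bottom.
# The last rule is a catch-all, just as in the original ELIZA.
RULES = [
    (r"I want (.*)", "Why do you want {0}?"),
    (r"I am (.*)", "How long have you been {0}?"),
    (r".*", "Tell me more."),
]

def respond(text):
    """Return the response for the first rule whose pattern matches."""
    for pattern, template in RULES:
        m = re.match(pattern, text, re.IGNORECASE)
        if m:
            return template.format(*m.groups())

print(respond("I want a better story"))  # Why do you want a better story?
```

Even this trivial version shows why such a system falls apart as a GM: it reflects the player's words back without any model of the world behind them.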
no subject
Date: 2012-11-13 09:11 pm (UTC)
In a pencil and paper game the players create the story as they go. In a computer game, as in a novel, the story is already written.
Pencil and paper games are constrained by the imaginations of the players. Computer RPGs are constrained by what the development teams write for them. There is only so much that a development team can deliver within a given time span.
Computer games are also constrained by the style of story-telling involved. Open worlds can't tell a coherent narrative without fences. Imagine an open world like Skyrim, where the player can visit important sites in any order he wishes, being used to tell an epic narrative like The Iliad. It would be like debinding a book, tossing the pages off a cliff, and reading the pages as you find them. Open worlds can be used for story-telling but a different kind of story-telling. They're good at telling lots of short, loosely connected stories. An open world would be a most excellent way of exploring Aesop's fables.
I do agree with the sentiment, though. Some game studios -- I'm pointing at you, Square -- spend vastly disproportionate resources on visuals. Gameplay suffers as a result. If you don't have good gameplay then you don't have a good game. Great visuals can make a good game better -- I'm pointing at you, Assassin's Creed -- but if you don't have the gameplay to back it up then you're just wasting time and money.
no subject
Date: 2012-11-14 12:18 am (UTC)
There is a big difference between "Can't provide the full RPG experience" and "doesn't even make the effort someone did fifteen years ago".
I don't expect games (in the short term; in the long term, I most certainly do; an AI GM will solve all the problems) to provide a full RPG experience. I DO expect them to give me MORE flexibility, MORE story, and MORE choices than they did fifteen years ago. I expect them to not make me put up with the compromises the SNES forced on programmers. I expect them to do BETTER AT FOOLING ME. I don't expect them to ACTUALLY fool me for long, but I DO want them to put enough effort into it that I'm not constantly tripping over the glaring omissions.
I also expect games with different types of content -- especially if you IMPLY that content is there -- that can be accessed by the player choosing to do so.
I expect my choices to MATTER. Chrono Trigger managed 15 separate endings, of which at least 5 were significant in their departure from each other.
I expect the NPCs to become more and more like people, which means they remember things and react appropriately -- i.e., if you remember I'm the Hero of Kvatch, you take that into account when you think about arresting me by yourself.
Skyrim actually manages to tell a quite coherent narrative with a minimum of fences; yes, you can depart from a quest line as you like, but the narrative remains coherent across the individual quests and connected sets of quests. That mechanic works well and I'm quite satisfied with that aspect of it. I'm even reasonably positive about their method of showing where to go to solve your quest; in a fully-realized world there'd be SO much more information available to you, but the game (or even a good GM!) can't provide ALL of it for you, so such useful yet not-intrusive indicators are quite good substitutes.
no subject
Date: 2012-11-14 02:05 am (UTC)
As for Skyrim and other games' NPCs? It's no different than what happens when you balance a jar on a guard's head. When you come back three months later? It'll still be there. Every computer game will have disconnects like that. A wise GM once told me that there are three solutions to every challenge. The first is the one the GM wants from his players. The second is the one the GM expects from his players. The third is what the players actually do. No development team can handle all possible cases of number three. No AI can handle all permutations of number three. Players will figure out or stumble across things the developers missed and ways to game the AI. Or they'll ignore it. On the other hand, a human GM in the here-and-now hopefully has the wisdom to recognize the absurdity and choose an appropriate course to take in response.
I find your take on this to be very interesting. Chrono Trigger has some worse (IMO) disconnects than any of these. This tells me that it's less about what the games you play are delivering and more about your willingness to suspend your disbelief and allow yourself to be immersed in the game despite the disconnects. You have the expectation that because Skyrim is fifteen years newer than Chrono Trigger it should be fifteen years better. If only reality were so kind. While you and I have been gaming over the past fifteen years, most of the development team on Skyrim were going to elementary and high school. It's ironic. The more role-playing possibilities that we want from games, the more that studios need to rely on staff with little or no role-playing experience.
no subject
Date: 2012-11-14 04:44 am (UTC)
"AI is not the answer. We've had AI since around 1965."
Er, no, we haven't. I don't mean the silly expert systems, neural nets, etc. I mean REAL AI. I expect that somewhere on the order of 50 years from now. (note that I say order, not CLOSE order, as in I'd be exceedingly surprised to see it pop up in five years, but equally surprised if it doesn't show up in five hundred).
Certainly no limited program can handle ALL the cases. I just want them to handle MORE of the cases. Ultimately, I want the Holodeck, or the simgames I describe in Grand Central Arena, but those are a couple centuries out. As a live GM myself, I'd really like some tools that let me, for instance, create imagery of my own world easily. But while wonderful strides have been made in those areas, we're still a long way away from that too.
And yes, I expect -- or rather, I don't EXPECT, but I WANT -- 15 years of progress in 15 years. The fact the current programmers were in elementary school is irrelevant; the advantage of being intelligent, literate beings is that we can build on the knowledge of those who came before (and I was actually roleplaying when I was in elementary school, I just didn't have D&D's mechanics...) That's why I'm writing this on a laptop and sending it to you via electronic means rather than chipping it painfully into a stone tablet in pictograms. They've had plenty of time to figure out how to make jawdropping super-graphics and all sorts of other bells and whistles; I just want them to put the same amount of effort into the story and world behavior that they do into making it look flashy. (Star Ocean 2 was a good example; they didn't have the best graphics in any sense of the word, except for their opening FMV, but BOY did they try to give me some sense of a more complex set of people and choices. Succeed in fooling me? No, but they did get considerable points for taking some time and making an effort.)
Give ME the salary of a major team head on one of those games, and the two years they give you for game development... Well, let's just say I've spent quite a bit of time thinking about it.
Understand, I KNOW what machines do. I understand how their logic tends to work and what the decision trees and so on that the programmers have to make are like, and how their difficulty can increase exponentially.
I also understand that they're spending a lot more money and manpower on this than I could easily imagine. So if I can spend a year and write six novels (and if I could write full time, I damn well could; I average slightly over two novels a year writing an average of a day per week), I don't think it's at ALL unreasonable to ask a team of dozens of people to give me the equivalent of ONE novel with six major paths through it with multiple possible outcomes. I really don't.
And given that (for instance) Skyrim and its relatives already HAVE some sort of "fame/infamy" switching in place, it seems even less demanding of me that they add in a little mechanic of "holy crap, I don't want to MESS with this guy", and other similar things.
no subject
Date: 2012-11-14 06:36 am (UTC)
Neural network systems are real AI. They learn in the same ways that living things learn. That's the point. They replicate biological neural networks. In addition to problem solving and fraud detection they've provided tremendous insight into how our own brains function by accurately replicating human brain function. That's as real as intelligence gets.
What science fiction calls artificial intelligence isn't AI at all. What science fiction calls AI are usually sophisticated expert systems with access to massive data stores. Siri and Google Now are perfect examples and you don't have to wait 50 years for them. The exceptions are what I call artificial sapience: an awareness, a consciousness. Such a thing isn't a program. It's a living being.
Getting back to the game aspect of it: We as players see these disconnects as problems but developers don't see disconnects at all. It's entirely likely that two or more different dev groups were responsible for various stages of NPC reactions. They don't see a disconnect within the game because they can't see what's going on outside of their respective compartments. Even if they do then it's not their problem because it's not a bug. The script calls for the NPC guard to try to arrest the PC. If the NPC guard tries then it's working as designed.
The problem is the writing. Whoever wrote that NPC's dialogue neglected to factor the possibility that the PC would be the Hero of Whatever. Maybe the writer was negligent. Maybe he forgot. Maybe he never knew of it (see previous about compartments). Regardless, the possibility wasn't included in the script so appropriate contingencies weren't coded in the game.
This is a problem with every large scale project. The individual groups can't see the whole and the few who can see the whole are busy with more important things than a minor continuity error in a minor NPC's dialogue. It's the ironic conundrum: you want more and better which requires more people involved, but more people involved makes for more little errors.
We *have* come a long way. Thirty years ago we had Rogue and Wizardry, simplistic games written by teams of one or two people that offered a basic tactical D&D kind of experience. Twenty years ago we got Final Fantasy and similar games that took the Rogue and Wizardry style of gameplay and used them to tell narrative stories. Ten years ago we got Neverwinter Nights which gave us the conversation simulator as a fundamental component of gameplay where your choices as a player have direct effects on the narrative. Today we have games that feel almost perfectly real. Ranting about a minor NPC's glitchy dialogue seems silly to me in that light.
Then again, I rant about bad fight choreography in movies and the terrible gimmicks that cinematographers use to cover it, so who am I to judge?
no subject
Date: 2012-11-14 07:55 pm (UTC)
Is that all you come away with from my entire discussion?
It's not just that. That's just one obvious example out of thousands, all of which have to do with the same basic problem: that they haven't really devoted much time to making the world work like a world, and thus there's dozens of things that throw me out of the story they're trying to tell. Yes, some of that would be near impossible. A lot of it wouldn't be, and if you devoted the effort to building the system in the first place, it'd make things EASIER in the longer run.
no subject
Date: 2012-11-14 09:40 pm (UTC)
What you propose wouldn't be easier and it wouldn't make things better. Let me try to put it in terms of writing. That might help make my point.
When I used to write I started with a general outline of the story as a whole. When it came to actually writing I wrote one scene at a time. Sometimes scenes didn't quite match up due to a perceived need to write scenes out of chronological order. Sometimes scenes didn't match up because a collaborative group of authors aren't going to write exactly the same way. Sometimes entire stories didn't match up. Either way, a good editor would glue the scenes together, cleaning up what didn't fit or sending it all back for a rewrite. This is hard as in approaching NP-hard. If the editor doesn't read the whole work then he won't catch all of the discontinuities.
Expand that complexity to a choose-your-own-adventure-style book with maybe a dozen permutations. The editor and assistants must read through many times to ensure that all of the permutations make sense at the ends. If they don't then they can't be sure they caught any broken paths. There are shortcuts that can be taken, certainly, but each path needs to be traced to ensure there are no broken ends that aren't intended.
Expand it again to a game like Chrono Trigger with hundreds of permutations. The "big cheat" with Chrono Trigger is that despite the number of permutations they all converge on five discrete endings. This reduces the complexity of the problem. It's still hard, just not nearly as hard as it would be with an open-ended system.
And then there's Skyrim, a game with ostensibly infinite permutations. It's not really infinite but the number is huge and there are few convergence cheats to simplify the problem. A player can choose to ignore the end game entirely but continue playing, for example. It is impossible for all practical purposes for a studio's QA people to exhaustively trace through every path through a game like Skyrim. It would take them decades at the least. They don't try. They take the "shortcut" of testing the most commonly anticipated event sequences and ensuring they work as designed and if the events pass the checklists then that aspect of the game gets a passing grade. Most of this is automated so waving around the "true AI" magic wand can't make it go much faster.
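The convergence "cheat" can be made concrete by counting playthroughs in toy story graphs (hypothetical structures, not any real game's): two stories with the same number of choices can have the same number of playthroughs but wildly different numbers of endings for QA to verify.

```python
def count_paths(graph, node):
    """Count distinct start-to-finish playthroughs from `node`
    in a branching story represented as a DAG of scenes."""
    nexts = graph.get(node, [])
    if not nexts:
        return 1
    return sum(count_paths(graph, n) for n in nexts)

def endings(graph):
    """All terminal scenes (nodes with no outgoing choices)."""
    nodes = set(graph) | {n for nexts in graph.values() for n in nexts}
    return sorted(n for n in nodes if not graph.get(n))

# Two acts of three choices each, but every branch converges
# back to a shared scene before the next act begins.
converging = {
    "start": ["a1", "a2", "a3"],
    "a1": ["mid"], "a2": ["mid"], "a3": ["mid"],
    "mid": ["b1", "b2", "b3"],
    "b1": ["end"], "b2": ["end"], "b3": ["end"],
}

# The same choice count with no convergence: every branch stays separate.
diverging = {
    "start": ["a1", "a2", "a3"],
    "a1": ["a1b1", "a1b2", "a1b3"],
    "a2": ["a2b1", "a2b2", "a2b3"],
    "a3": ["a3b1", "a3b2", "a3b3"],
}

print(count_paths(converging, "start"), len(endings(converging)))  # 9 playthroughs, 1 ending
print(count_paths(diverging, "start"), len(endings(diverging)))    # 9 playthroughs, 9 endings
```

Both graphs offer nine playthroughs, but the converging one leaves testers a single ending to sign off on; without convergence the work grows with every branch point added.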
All of the problems in a game cannot be fixed without sufficient time to test, find, fix and repeat. The bigger the project, the more discontinuities it will have and the more time it takes to find and fix them all. How a game is assembled won't change this.
And all of that is an obtuse way of saying that if you don't like discontinuities in big, complex, open-ended games then don't play big, complex, open-ended games. They're not for you.
no subject
Date: 2012-11-14 11:20 pm (UTC)
As for "true AI magic wand", you seem to misunderstand. You seem to be postulating using the True AI to do the ridiculous "check all alternatives for consistency", which is indeed an impossible task as the number of choices increases. That's not what you'd use a true AI for in a game context. You'd use them as the GM. I don't "check all alternatives for consistency" in a branching choice tree, either when I write, or when I run games. I construct a WORLD. The world answers 99.9% of all my questions of what happens and how. I envision a plot, and follow what logically happens to the people, places, and things as they travel along the plot.
When I write a book, I often envision multiple branch points. Since I am only interested in the one plot I don't write those out, but if you asked me to (for example) write a game of "Phoenix Rising" or "Grand Central Arena", I would summarize all of those choices and their likely overall consequences.
Yes, if you design a game by having one person come up with a general idea, and then segregate different parts of it to different people and not have them all work together, you'll have a patchwork that has to be hacked together and the edges covered up with glue. But that's not a good way to design a story. There's a REASON that most novels are not written by a dozen people, but by one or two people who work VERY closely together. The reason is that one person can maintain a coherency and consistency of narrative, as well as world, that is VASTLY harder for a committee. That's why the D&D worlds tend to be difficult-to-understand mishmashes when viewed overall, even if individual pieces -- which were mostly the work of one person -- can be reasonably consistent and solid.
no subject
Date: 2012-11-15 12:26 am (UTC)
Your right way/wrong way ideas are nonsense. There is no right way to design and build a program. There is no wrong way to do it, either. There are a number of philosophies and methodologies to choose from. Some are better for some kinds of development but worse for others. Even if a studio follows your "right way" there will be a variety of different impressions and interpretations of the script. Even if the world itself is perfect the implementation of it is going to have discontinuities because other people are involved in the build process. The only way to avoid this is to do it all yourself and that's a non-starter for A-list titles today.
And back around to the AI nonsense. An AI will never be a good GM. It's another simple reason: a program can't have fun. It can't enjoy the experience of playing the game. It can only do what it's written to do.
no subject
Date: 2012-11-15 02:01 am (UTC)
however:
"And back around to the AI nonsense. An AI will never be a good GM. It's another simple reason: a program can't have fun. It can't enjoy the experience of playing the game. It can only do what it's written to do. "
A true AI would BE a person. We're nothing but complex programs -- accidental ones running on biological strata, but unless you're a dualist you should realize that this means we are purely a product of physical phenomena, and if your computer can duplicate that process then it would be just as much a person as you are. These are the types of AIs I depict in Grand Central Arena and that are common elsewhere in SF.
If you ARE a dualist -- if you think there's some nonphysically duplicable component of humanity that makes it impossible for us to make a computer to properly simulate -- then we're simply talking from different religious points of view. I admittedly can't PROVE there is no nonphysical portion of us, a soul, a spirit, whatever, but I haven't seen any evidence that there IS.
Yes, if you envision an AI as being something that's hard coded, every line entered by a human being who thought it through, yeah, it's gonna be a non-person.
But that's not how you'd do it. You want to make a computational person, you give it the same ability we do: to modify and program itself, in a wide variety of ways. Maybe you even RAISE it like a person. How, exactly, you design the core "seed", well, that's what I expect to see in the order of 50 years. Worst comes to worst, you get a computer big enough to simulate every single neuron and interconnection in a human brain, and all the biological processing that goes on within each cell, and start that puppy running with inputs that simulate what the brain gets from the rest of the body.
But I doubt it will require taking that extremely crude brute-force approach; I suspect someone will figure out a more elegant way to do it in the next few centuries.
We haven't succeeded yet because we're still unsure of how WE do what we do. We know little tiny disconnected pieces of it, but no idea of how it all comes together to make a person with consciousness and emotions and so on.
no subject
Date: 2012-11-15 06:02 am (UTC)
I wrote as much way back at the beginning of this. I've used the term "artificial sapience" several times in this context. And I repeat myself: what you call AI is not intelligence (knowledge) but sapience (awareness). It's life.
Which opens up a huge can of philosophical and legal worms. What is a person? If a thing is indistinguishable from a person then is it also a person? If it is a person then does it have the same legal rights and privileges as a natural person? And if so then would confining it to an embedded system such as a personal information manager or a ship's navigation console or a video game console be a form of slavery?
"We're nothing but complex programs"
I disagree. A program is a formalized sequence of instructions. To say that we are nothing more than programs suggests that we are toys in Sims game with just a simulation of creativity and will programmed by someone or something else. I don't like that idea. Accepting it is, in my mind, tantamount to rejecting my existence.
I don't consider myself to be a dualist. At the same time I admit that my ideology does skirt the edges of dualism.
"But I doubt it will require taking that extremely crude brute-force approach; I suspect someone will figure out a more elegant way to do it in the next few centuries."
You're behind the times by over a century. The formal concept of biological neural networks as the basis for human cognition was introduced in the mid-to-late 1800s. Those theories have held up to over 100 years of scrutiny, and they've held up to over 50 years of computational simulation and analysis.
no subject
Date: 2012-11-15 01:19 pm (UTC)
This may End Badly if we don't figure out how to address it, since there's at least some good reason to believe that if we succeeded, we could also succeed in making things whose cognitive capabilities were vastly greater than our own. Keeping something with the intellect of a demigod as a slave probably won't engender good feelings.
Jeez, that certainly wasn't the impression I got in my AI classes back in college. I remember extensive discussions of the shortfalls of neural networks and the fact that we knew so very little about human cognition that it was pretty much impossible to say exactly how it worked. Oh, you could show "toy" versions of some things that human beings do -- very, very simple ones -- but you could do the same with fuzzy expert systems, too.
AFAIK, we're still working on making neural networks on the same scale as not-terribly complex invertebrates, moving into higher insects. So how we can say ANYTHING about how well, or badly, this translates to what human beings do with their higher cognitive functions, I really don't know.
Admittedly, I only touch on this field occasionally these days, but I do know that insofar as doing things that are USEFUL we've still found that neural networks weren't the way to go, at least here at work -- and some of what we do would seem ideal for perceptual neural network applications.
I think that at MOST one can say "well, we're all built of neurons upstairs, and if you have a sufficiently accurate simulation of what neurons (and their associated cells) do, and you make that simulation cover the entirety of a human brain, you could probably make a simulation of a human mind". Though I'm not sure how you'd (for example) determine what the "weighting" of each neuron, and the precise characteristics of their reverse propagation, etc., would be in order to have the necessary "kernel" in place that allows a baby to basically learn everything about the world from scratch. We know for a fact that if you don't make a neural net with the proper weighting and other factors it just won't perform well.
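The "proper weighting" point can be demonstrated with a toy network: a two-layer net started with all-zero weights gives every hidden unit identical updates, so the units never differentiate and XOR can never be learned, while small random starting weights break the symmetry. A pure-Python sketch (the network size, learning rate, and epoch count are arbitrary choices, not anyone's real system):

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def train_xor(init, epochs=2000, lr=0.5):
    """Train a tiny 2-2-1 network on XOR with plain backpropagation.
    `init` is a function producing each initial weight."""
    W1 = [[init() for _ in range(3)] for _ in range(2)]  # hidden: 2 inputs + bias
    W2 = [init() for _ in range(3)]                      # output: 2 hidden + bias
    data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]
    for _ in range(epochs):
        for x, target in data:
            xi = x + [1]
            h = [sigmoid(sum(w * v for w, v in zip(row, xi))) for row in W1]
            hi = h + [1]
            y = sigmoid(sum(w * v for w, v in zip(W2, hi)))
            dy = (y - target) * y * (1 - y)                       # output error
            dh = [dy * W2[j] * h[j] * (1 - h[j]) for j in range(2)]  # backpropagated
            for j in range(3):
                W2[j] -= lr * dy * hi[j]
            for j in range(2):
                for k in range(3):
                    W1[j][k] -= lr * dh[j] * xi[k]
    return W1

random.seed(1)
zero = train_xor(lambda: 0.0)                    # all-zero start: symmetry never breaks
rand = train_xor(lambda: random.uniform(-1, 1))  # random start: hidden units differentiate
print("zero-init hidden rows identical:", zero[0] == zero[1])    # True
print("random-init hidden rows identical:", rand[0] == rand[1])  # False
```

With zero initialization the two hidden rows stay bit-for-bit identical no matter how long you train, which is the artificial-network cousin of the question above: something has to set the starting conditions before learning can even begin.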
And TRAINING it is a bitch and a half for any serious task, which is an area we're having to deal with now (I don't think they're using neural nets in this particular case at IEM, specifically, but they are using one of the related approaches that uses training sets for recognition, and boy, is it hard to be sure you have enough training sets, with enough diversity, and no spurious common components that it might turn out to be training on instead...)
no subject
Date: 2012-11-15 06:02 pm (UTC)
I say this because we've already done it. Sort of. The IBM Blue Gene/P supercomputer has simulated the neuron count of an entire rat's brain but they didn't get the cognitive process of a rat out of the simulation. Adding neurons doesn't make the simulation any more life-like. It stands to reason that increasing the neuron count to the approximately 85 billion neurons in an adult human's brain won't generate a human-like intelligence.
I did not intend to suggest that neural networks are the only viable theory of human cognition. Neural network theory appears to be the most accurate description of how natural learning and cognition work. It's a foundation for research, not the be-all, end-all in cognitive science.
Fuzzy logic, on the other foot, isn't cognitive science at all. It's a kind of set theory. Its primary use is in decision making processes given incomplete or unreliable input data. This is valuable to certain kinds of expert systems, such as those used in meteorology and weather simulation.
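To make the set-theory point concrete: a fuzzy set assigns each element a degree of membership in [0, 1] rather than a yes/no, with min and max standing in for AND and OR. A toy Python illustration (the rainfall thresholds are invented):

```python
def membership_heavy_rain(mm_per_hour):
    """Degree to which a rainfall rate counts as 'heavy rain'.
    Below 2 mm/h: not at all; above 10 mm/h: fully; linear in between.
    (Toy thresholds for illustration only.)"""
    if mm_per_hour <= 2.0:
        return 0.0
    if mm_per_hour >= 10.0:
        return 1.0
    return (mm_per_hour - 2.0) / 8.0

def fuzzy_and(a, b):
    """Fuzzy conjunction: the minimum of the two membership degrees."""
    return min(a, b)

def fuzzy_or(a, b):
    """Fuzzy disjunction: the maximum of the two membership degrees."""
    return max(a, b)

print(membership_heavy_rain(6.0))  # 0.5 -- halfway to 'heavy'
```

A rule engine built on such memberships can act sensibly on a reading like "6 mm/h" without ever deciding categorically whether the rain "is" heavy, which is exactly the incomplete-data strength described above.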
Back on the first appendage, neural networks excel when adaptive learning is desired. I mentioned credit card fraud detection already. These systems build dynamic data structures that model each card holder's buying habits and flag transactions that don't fit the models. Neural nets are also good for adaptive pattern matching which includes vision and hearing/recognition.
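The fraud-detection idea can be caricatured without any neural net at all: model a card holder's habits, then flag what doesn't fit the model. A deliberately crude, static Python stand-in (real systems adapt their per-customer models continuously; the numbers here are invented):

```python
import statistics

def flag_unusual(history, transactions, threshold=3.0):
    """Flag transaction amounts more than `threshold` standard deviations
    from a card holder's spending history. A real fraud system updates an
    adaptive per-customer model; this static version only shows the idea."""
    mean = statistics.mean(history)
    sd = statistics.stdev(history)
    return [t for t in transactions if abs(t - mean) > threshold * sd]

habits = [20.0, 35.0, 18.0, 42.0, 25.0, 30.0]  # invented purchase history
print(flag_unusual(habits, [28.0, 900.0]))      # [900.0]
```

The neural-net versions earn their keep by updating the "habits" model with every transaction instead of treating it as fixed, but the flag-what-doesn't-fit structure is the same.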
no subject
Date: 2012-11-15 08:13 pm (UTC)
The test with the rat brain sim proved nothing, really. We aren't born as just collections of identical, undifferentiated neurons. Those neurons are grown together in some kind of pattern, and they start out "programmed", in a sense. If you want the NNet parallel, the weights and backpropagation and so on have some kind of pre-sets so that when the brain starts running, it can already start making sense of the world.
There is some kind of a "kernel", a core program of sorts, that tells the brain how it can learn to learn -- how it can extract patterns of meaning and associate them with other patterns. This may be hardcoded in some part of the overall brain structure.
Looking at the entry I see for the project you mention, it appears that they are indeed ATTEMPTING to simulate a full rat brain, but haven't gotten there (2014 is apparently the goal for full-scale ratbrain).
While it does, to my surprise, appear they're actually trying to simulate the operation of real neurons, it does not appear that they're also including other crucial aspects -- for instance, the brain simulations may not include simulations of the biological processes that affect all the neurons and their performance in multiple circumstances (i.e., the various endocrine systems, etc.), and it does not appear to have the inputs or even bandwidth FOR inputs that matches the scale of input that a rat normally has from its ears, eyes, nose, mouth, and skin/vibrissae. Without that, it will have no data to process to learn what it means to BE a rat, or act like one.
We don't KNOW how the brain "bootstraps" into being a self-programming, learning thing that goes from "makes vague noises" to "demanding daddy sit down and read her a story". We don't know what that "kernel" is, to use my terminology -- is it in the precise linking structure of neurons? In the chemical state of the neurons at their start? Something in the support structure (glial cells)? Is it hard-coded in structure or is it more software that's emergent from the way the brain grows from a single cell?
no subject
Date: 2012-11-16 01:18 am (UTC)
Your surprise about the Blue Gene/P rat brain project is due to your expectation that the team wants to simulate a real rat, which they don't. At least that's not their goal. It's not meant to be a rat. There's no point. A simulated rat cannot be guaranteed to behave exactly the same way a real rat would. That makes it useless as an experimental subject and as an observational control for experiments with real rats. The goal is to create a system with the neurological complexity of a rat's brain. This is so much more valuable than trying to be a rat. There are all sorts of computational experiments that can be done with it. But a real rat? Cheaper and easier to breed them by the gross.
no subject
Date: 2012-11-16 01:55 am (UTC)
My surprise was that they'd actually gotten far enough to do that level of simulation. But making a system with the neurological complexity, but no guarantee it actually DOES anything? I don't see what value that has. Finding out how to actually simulate a rat? Infinitely more valuable, because then you actually start to understand how life, and thought, WORKS.
no subject
Date: 2012-11-16 02:34 am (UTC)