Author Topic: Mass Effect 2 *spoilers around pg 29/30*  (Read 629847 times)
Lakov_Sanite
Terracotta Army
Posts: 7590


Reply #805 on: February 16, 2010, 07:30:48 AM

I found full renegade thoroughly enjoyable.  The best part is that it's not goody-two-shoes vs. kitten-killing evil in ME2 the way it is in many BioWare games.  Shepard is saving the galaxy one way or another.  The only difference is whether you wanna catch reapers with honey, or by shoving your N7 boot up someone's ass.

~a horrific, dark simulacrum that glares balefully at us, with evil intent.
NowhereMan
Terracotta Army
Posts: 7353


Reply #806 on: February 16, 2010, 07:40:39 AM

Also, talking about actual gameplay: going through as a soldier did feel significantly more FPS-ish than ME1, even when I was bothering to direct squadmates to target enemies and get behind cover. It was all run-and-gun and ammo management. I would have liked a bit more inventory stuff; maybe not to the level of ME1, where there were huge tons of junk loot, but having some more options with weapons would have been nice. Sniper rifles and heavy weapons were the only weapon types that offered any real choice; the others were obvious upgrades. Shepard's armour offered some selection in terms of choosing bonuses, but personally I'd have preferred to see bonuses handled more through items than purely research, even if it was just offering some decent armour. I'd also really, really have liked to be able to give armour to party members as well, which I guess is all part of a more traditional PC-style RPG experience. ME2 just feels a little too stripped down to plot and action, although that's handled really well.

"Look at my car. Do you think that was bought with the earnest love of geeks?" - HaemishM
Shrike
Terracotta Army
Posts: 939


Reply #807 on: February 16, 2010, 09:12:41 AM

I've done both renegade and paragon Shepards. They're both fun, just in different ways. Sometimes the differences between the two aren't that apparent, either. Interestingly, my renegade Shep had almost full renegade and over 3/4 paragon. He had some interesting decisions to make. There are a LOT of cutscene interrupts that you just have to do. I mean, you can't resist...


Then there's the last mission--again.


Now, time to bust out ME1 and really get the final result I'm looking for going into ME3. Damnit, not sure I can last a year or two!
NowhereMan
Terracotta Army
Posts: 7353


Reply #808 on: February 16, 2010, 09:20:13 AM

Other Renegade moment of note

"Look at my car. Do you think that was bought with the earnest love of geeks?" - HaemishM
Ard
Terracotta Army
Posts: 1887


Reply #809 on: February 16, 2010, 09:38:40 AM

^
NowhereMan
Terracotta Army
Posts: 7353


Reply #810 on: February 16, 2010, 09:40:47 AM

Hah, that was practically the only one I actually did go through with. Damn, it felt good for my generally good Paragon Shepard to do: what with being pissed off at all the general anti-human sentiment, and discovering I'd killed Wrex, he had to blow off some steam.

"Look at my car. Do you think that was bought with the earnest love of geeks?" - HaemishM
Engels
Terracotta Army
Posts: 9029

inflicts shingles.


Reply #811 on: February 16, 2010, 09:42:55 AM

I nearly always take the red mouse-click options, even if I'm doing a paragon run. Am I doing this wrong? Is getting renegade points when there isn't a paragon option somehow detrimental to the paragon build?

I should get back to nature, too.  You know, like going to a shop for groceries instead of the computer.  Maybe a condo in the woods that doesn't even have a health club or restaurant attached.  Buy a car with only two cup holders or something. -Signe

I LIKE being bounced around by Tonkors. - Lantyssa

Babies shooting themselves in the head is the state bird of West Virginia. - schild
Khaldun
Terracotta Army
Posts: 15189


Reply #812 on: February 16, 2010, 09:47:42 AM

There are a few renegade trigger pulls that are absolutely fucking irresistible, yes. Especially the ones where the alternative is to stand there and let some asshole open up on you.
Shrike
Terracotta Army
Posts: 939


Reply #813 on: February 16, 2010, 10:04:49 AM

It doesn't hurt anything, and most are nigh-on irresistible. My paragon Shepard was max paragon and about one bar into renegade. No problems there, though I did get my renegade on a bit in the last mission, since I was maxed otherwise. I do like Shep's "We're going to break it off in their ass--sideways" monologues at the end.

My only real concern going into ME3 is what reactions you provoke in TIM at the final, um, debrief. I took the middle road (mostly) on that one and was pleased with the result. I think. Guess we'll see in a year or two.
Ard
Terracotta Army
Posts: 1887


Reply #814 on: February 16, 2010, 10:17:04 AM

There are a few renegade trigger pulls that are absolutely fucking irresistible, yes. Especially the ones where the alternative is to stand there and let some asshole open up on you.


For me, I mostly took those as "I'm an unstoppable killing machine of a soldier" moments, and let them take the first shot so I could gun them down mercilessly.  Just because the whole universe is in danger doesn't mean civilization should come to a stop.  That's the road to the dark side, dammit.  Now pass me my tea.
Stormwaltz
Terracotta Army
Posts: 2918


Reply #815 on: February 16, 2010, 10:22:45 AM

Emotions as an abstract concept are orthogonal to whether a life-form is organic or not.

I believe emotions in "life as we know it" are largely a product of chemical processes in the meat brain; hormones, pheromones, adrenaline, etc.

So from my perspective, while organic life may evolve without responses akin to emotions, electronic life cannot evolve responses akin to emotions.

Note I said "evolve." The geth are a "ground up" AI that evolved from non-sentient code. EDI and the other AIs in the IP are "top down" models designed and coded specifically to gain sapience. If they're programmed to have responses akin to emotions, they will. EDI has a sense of humor, for example, but she doesn't have the capability to get mad. You don't want your starship OS getting mad at you.

Nothing in this post represents the views of my current or previous employers.

"Isn't that just like an elf? Brings a spell to a gun fight."

"Sci-Fi writers don't invent the future, they market it."
- Henry Cobb
NowhereMan
Terracotta Army
Posts: 7353


Reply #816 on: February 16, 2010, 10:32:51 AM

It's true if you take emotions purely as what they physically are, but looked at functionally, they basically act as perception/action heuristic devices, focusing our attention on certain details and shunting certain sorts of actions to the front of our awareness. Like I said earlier, AI with sufficient processing power might not need them, but in many ways they make very efficient use of the processing available. In AI terms it would be like the Geth having sets of lower-level programmes that can mark certain courses of action or elements of the environment as more or less important/urgent. The Geth idea is interesting because there's a wholly alien life-form at work, but in some ways they're weird because 1) they don't seem to have any real conception of emotions at all, being purely analytic/mathematical, and 2) they've got the whole swapping in and out of embodiment thing going on. In sci-fi terms it's easy enough to deal with via the group mind, but it does seem odd that a mind perfectly suited to working in an abstract mathematical environment is also perfectly capable of navigating the physical world and interacting with organics. But then, based on Thane, you've also got some pretty strongly dualist ideas on mind and body, so it's not really that different from those organics' attitudes towards it.

Critique stuff aside, I really loved the writing and ideas. I'll try not to derail from talking about blowing shit up and Shepard having to choke a bitch.

"Look at my car. Do you think that was bought with the earnest love of geeks?" - HaemishM
Khaldun
Terracotta Army
Posts: 15189


Reply #817 on: February 16, 2010, 10:56:10 AM

Emotions as an abstract concept are orthogonal to whether a life-form is organic or not.

I believe emotions in "life as we know it" are largely a product of chemical processes in the meat brain; hormones, pheromones, adrenaline, etc.

So from my perspective, while organic life may evolve without responses akin to emotions, electronic life cannot evolve responses akin to emotions.

Note I said "evolve." The geth are a "ground up" AI that evolved from non-sentient code. EDI and the other AIs in the IP are "top down" models designed and coded specifically to gain sapience. If they're programmed to have responses akin to emotions, they will. EDI has a sense of humor, for example, but she doesn't have the capability to get mad. You don't want your starship OS getting mad at you.

I just think this is a weird assumption, that digital or electronic life will not have responses that are emotional in some sense or another. If you assume that electronic sentience will be designed, then it could just as easily be designed to have some feedback loops in its intelligence that have an emotional character. If you assume artificial sentience would evolve from some non-intelligent machine or code, this is even more likely. Anything that evolves is changing in relationship to an external environment that it does not control and is separate from. Any substantial evolutionary change (such as developing independent sentience) is almost by definition going to be an emergent, unintended consequence of multiple simultaneous interactions between external environments and internal characteristics that impose a fitness test of some kind on the thing which has evolved. That leaves all sorts of room for an artificial organism to have internal dynamics and responses which it doesn't fully control.

Do geth units try to avoid being destroyed? I assume so: even a networked organism is going to have resource limitations and is not going to want to constantly build new bodies. If the geth network tries to anticipate circumstances which lead to the destruction of units, that's a platform for *feelings* of some kind or another. Do the geth model the actions of other sentients unlike themselves? If not, they're purely solipsistic and shouldn't be able to communicate with Shepard et al. in any way. If they do model other sentients, they have to be able to model (and enact) subjective mental states. That's part of why organics have emotions: we're trying to develop anticipatory simulations of future actions that include models of the consciousness of others. Do the geth have preferences about the outcomes of their existence, to which they shape their actions? Obviously even the non-heretics do, in that they prefer autonomy to subjugation to the Quarians. That preference is a potential platform for something like emotions: it's a subjective preference, not a purely objective, rational one. Etc.
« Last Edit: February 16, 2010, 10:57:48 AM by Khaldun »
Fabricated
Moderator
Posts: 8978

~Living the Dream~


Reply #818 on: February 16, 2010, 10:57:16 AM

Finally picked up ME2. Zaeed really, really feels tacked on unless he has some really awesome moments in the main campaign.

"The world is populated in the main by people who should not exist." - George Bernard Shaw
Lakov_Sanite
Terracotta Army
Posts: 7590


Reply #819 on: February 16, 2010, 11:08:29 AM

Finally picked up ME2. Zaeed really, really feels tacked on unless he has some really awesome moments in the main campaign.

His loyalty quest is a lot of fun, but no, he really has nothing to do with the main campaign at all... which is a shame, because to me he was far more interesting than Jacob.

~a horrific, dark simulacrum that glares balefully at us, with evil intent.
tmp
Terracotta Army
Posts: 4257

POW! Right in the Kisser!


Reply #820 on: February 16, 2010, 11:26:10 AM

I believe emotions in "life as we know it" are largely a product of chemical processes in the meat brain; hormones, pheromones, adrenaline, etc.
That's how I'd view it too. Fear, pleasure, etc. putting the body into a fight-or-flight state, or providing an incentive to pursue beneficial goals, or an instinctual impulse to preserve one's species or just oneself... that sort of thing. A computer program wouldn't exactly need these, because it can perfectly well calculate and/or measure such factors consciously while still maintaining its regular mental processes without interruption.
Stormwaltz
Terracotta Army
Posts: 2918


Reply #821 on: February 16, 2010, 12:31:09 PM

I just think this is a weird assumption, that digital or electronic life will not have responses that are emotional in some sense or another.

I see it as a question of what triggers the response.

In humans, the chemical processes in our bodies are involuntary and influence our higher cognitive processes. In other words, our "hardware" has a degree of control over our "software."

In an electronic intelligence, hardware is simply a conduit that passes input on to cognitive decision making software to be analyzed. A microphone doesn't flinch from a loud noise.

Nothing in this post represents the views of my current or previous employers.

"Isn't that just like an elf? Brings a spell to a gun fight."

"Sci-Fi writers don't invent the future, they market it."
- Henry Cobb
Fabricated
Moderator
Posts: 8978

~Living the Dream~


Reply #822 on: February 16, 2010, 12:54:40 PM

Not to oversimplify, but is there anything, programming-wise, that amounts to more than an IF statement when broken down to its base elements?

"The world is populated in the main by people who should not exist." - George Bernard Shaw
eldaec
Terracotta Army
Posts: 11844


Reply #823 on: February 16, 2010, 01:24:59 PM

I just think this is a weird assumption, that digital or electronic life will not have responses that are emotional in some sense or another.

I see it as a question of what triggers the response.

In humans, the chemical processes in our bodies are involuntary and influence our higher cognitive processes. In other words, our "hardware" has a degree of control over our "software."

In an electronic intelligence, hardware is simply a conduit that passes input on to cognitive decision making software to be analyzed. A microphone doesn't flinch from a loud noise.

The chemical processes you're talking about causing more emotional/chaotic behaviour under stress aren't much different from how an imperfect mathematical model might perform outside its normal operating range, or given a large number of unknown variables, especially if you assume the model has been generated by some kind of neural network effect rather than programmed in.

Life implies sentience implies self image implies conflicting desires and ambitions. These will naturally make any decision making model more chaotic.

A microphone doesn't flinch at a loud noise. But a self-aware living robot microphone with a desire to live and record beautiful music might just flinch, if its ancestors had developed some kind of flinch reflex to protect against marauding SPECTREs.

Not to oversimplify but is there anything that amounts to more than an IF statement programming wise, when broken down to its base elements?

It's doubtful that there is anything in your brain that does either.

(But IF is actually a pretty broad and complex function when it comes down to it)
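eldaec's parenthetical can be made concrete with a toy Python sketch (nothing here comes from any real AI system; the function names and numbers are invented for illustration). A bare IF collapses every input to one of two answers, while even a single artificial neuron, which contains no branch at all, produces a graded response:

```python
import math

def step(x, threshold=0.0):
    # A bare IF: everything collapses to one of two answers.
    return 1.0 if x > threshold else 0.0

def neuron(inputs, weights, bias=0.0):
    # A minimal artificial neuron: a weighted sum squashed by a logistic
    # function.  No branch anywhere, and the response is graded.
    total = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1.0 / (1.0 + math.exp(-total))

# Two stimuli of different intensity (made-up numbers):
quiet = neuron([0.2, 0.1], [1.5, 2.0])   # roughly 0.62
loud = neuron([0.9, 0.8], [1.5, 2.0])    # roughly 0.95

# The IF treats both stimuli identically; the neuron distinguishes them.
assert step(0.2) == step(0.9) == 1.0
assert quiet < loud
```

The point being: "broken down to its base elements" a weighted sum plus a squashing function behaves nothing like a single two-way branch, even though both are built from the same primitive operations.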

"People will not assume that what they read on the internet is trustworthy or that it carries any particular ­assurance or accuracy" - Lord Leveson
"Hyperbole is a cancer" - Lakov Sanite
Lantyssa
Terracotta Army
Posts: 20848


Reply #824 on: February 16, 2010, 02:31:40 PM

A microphone doesn't flinch at a loud noise. But a self-aware living robot microphone with a desire to live and record beautiful music might just flinch, if its ancestors had developed some kind of flinch reflex to protect against marauding SPECTREs.
A better analogy might be that the sentient microphone would seek to protect itself from excessively loud noises which render it useless.  Further, subtle sounds through the microphone generations may provoke different reactions, initially for self-benefit, but which later are still present but no longer necessarily tied to the initial sound.

Our microphone may choose to seek out beautiful harmonies, to avoid dissonance, or to find new materials which allow it to record at higher fidelities.  It might rationalize why it does these things, but it may no longer pick the optimal methods to do so due to conflicting goals.

Hahahaha!  I'm really good at this!
tmp
Terracotta Army
Posts: 4257

POW! Right in the Kisser!


Reply #825 on: February 16, 2010, 02:55:46 PM

A better analogy might be that the sentient microphone would seek to protect itself from excessively loud noises which render it useless.
It still won't experience physical pain if it happens to be in an area with excessively loud noises, though. This particular mechanism, developed by our bodies to get us to focus on a dangerous condition, simply isn't there in the machine/software, nor would the machine benefit from developing such a reaction, as it's already equipped with a much more sophisticated priority-based system to accomplish the same task.
Riggswolfe
Terracotta Army
Posts: 8045


Reply #826 on: February 16, 2010, 02:58:21 PM

I should also say that after a from-scratch playthrough, I'm going to reinstall ME1 and play through it again for another ME2 playthrough. Of course, now I'm torn between playing through the first as a Paragon again or properly playing through as a renegade.

Also renegade options were too damn tempting in ME2. Despite being determined to max out paragon I still ended up getting a rank or two of renegade from the odd badass moment I just couldn't resist, especially in cutscenes.

An RPGer on another forum I'm on is recounting his ME2 playthrough as a story. His ME1 import is a Renegade Shep, but he's mostly going Paragon in ME2. The idea being that 1) he's seen the repercussions of his actions, and 2) the universe has gone Renegade around him, so the only way to truly be a Renegade is to be Paragon. It's cool to see him using internal dialogue as an excuse for doing the badass Renegade interrupts: "Damn it Shep, you're supposed to be nice to people now!"

"We live in a country, where John Lennon takes six bullets in the chest, Yoko Ono was standing right next to him and not one fucking bullet! Explain that to me! Explain that to me, God! Explain it to me, God!" - Denis Leary summing up my feelings about the nature of the universe.
Lantyssa
Terracotta Army
Posts: 20848


Reply #827 on: February 16, 2010, 03:48:25 PM

A better analogy might be that the sentient microphone would seek to protect itself from excessively loud noises which render it useless.
It still won't experience physical pain if it happens to be in an area with excessively loud noises, though. This particular mechanism, developed by our bodies to get us to focus on a dangerous condition, simply isn't there in the machine/software, nor would the machine benefit from developing such a reaction, as it's already equipped with a much more sophisticated priority-based system to accomplish the same task.
A sentient microphone may very well generate the equivalent of pain receptors to prevent damage to itself.  There is an evolutionary benefit to pain.  The few unfortunate humans without working pain receptors can suffer serious trauma without realizing the damage they are doing to their body.

Congenital Insensitivity

Hahahaha!  I'm really good at this!
Khaldun
Terracotta Army
Posts: 15189


Reply #828 on: February 16, 2010, 03:50:15 PM

I know we're getting close to a threadjack here, but let's build on the microphone-noise thing.

Why do we flinch when we hear a loud noise? Because we've got sensory and cognitive subroutines that recognize a loud noise as a possible precursor to serious danger and are putting our bodies on high alert. But also because the sensory apparatus that hears noise experiences extreme stimuli as close to pain, in part to protect the future integrity of that sensory apparatus. Both of these are adaptive responses that protect us. One is a post-facto response (that noise was loud enough to endanger hearing! try to minimize exposure!) and the other is anticipatory (unexpected stimulus which may have a relationship to unexpected danger! be ready!)

Now imagine a sentient robot that has a sound sensor which is important to its ability to operate. It's sentient, so it's going to be able to come to anticipatory conclusions as well. It may have more data about the total range of sounds that can be made, or a superior ability to match sounds to underlying conditions, though there too, I don't think that's necessarily a valid conclusion. The human brain holds a lot of data and has very effective cognitive heuristics--there's no reason to suppose an artificial organism will inevitably be vastly better in this way. Even if it were, it would still need to prepare a quick response to unexpected stimuli that it cannot immediately match to a memory or to data. If it didn't have that ability, then its sentience is frankly inferior: it's going to get smashed by something someday because it's sitting there trying to figure out what that unexpected stimulus was. It's also going to need to protect its sensor from overload damage, same as an organic. So we flinch; the robot is probably going to do something like "flinching" as well.

Ok. So now take it to something like "modelling the likely reaction of another sentient being". That's what a lot of our emotions are about: anticipating the consciousness of another human, trying to react to that consciousness or trying to preemptively manipulate it into some other state of mind or attitude. An artificial intelligence is going to be at a terrible disadvantage if it doesn't have a similar ability: about the only mode of action available to it will be something resembling sociopathy, a solipsistic assumption that it is the only sentience in existence.
Ratman_tf
Terracotta Army
Posts: 3818


Reply #829 on: February 16, 2010, 04:16:26 PM

Once upon a time, I built simple robots. Photovores. They're easy to make. You basically solder together a bunch of ICs, a capacitor, some resistors, an ohmmeter, a couple of photoreceptive diodes, a solar panel, and a couple of pager motors.

The diodes are like eyes. Two set apart. The critter uses them to tell which direction has the most light. This fires off the opposite pager motor in a short burst after the solar panel has collected and stored enough power in the capacitor.

The result is not only a little robot that waddles towards light, but one that also avoids darkness. I could prose that up and say that the photovore likes sunlight and dislikes darkness, and in a sense I wouldn't be wrong.

What I'm trying to say is that this is a robot whose reactions to the outside world are defined by how its body is constructed.

We have no real life examples of AI for comparison, and maybe AI will just turn out to be a very clever programming issue. But just maybe, sentience implies some kind of emotion. That it's an inevitable result of sentience. Just to throw that idea out there.
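The photovore's behaviour is simple enough to sketch as a toy model in Python (this is the decision logic only, not a circuit simulation; the names and numbers are invented): compare the two eyes, and once the capacitor holds enough charge, pulse the motor opposite the brighter side.

```python
def photovore_step(left_light, right_light, charge, fire_cost=1.0):
    """One decision cycle of a two-eyed photovore (a toy model, not a
    circuit simulation).  Returns (action, remaining_charge)."""
    if charge < fire_cost:
        return ("wait_for_charge", charge)       # capacitor still filling
    if left_light > right_light:
        # Brighter on the left: pulse the RIGHT motor to turn left.
        return ("pulse_right_motor", charge - fire_cost)
    elif right_light > left_light:
        return ("pulse_left_motor", charge - fire_cost)
    return ("pulse_both", charge - fire_cost)    # equal light: go straight

action, charge = photovore_step(left_light=0.8, right_light=0.3, charge=2.0)
print(action, charge)
```

Same point as the hardware version: the "preference" for light is nothing but the wiring, here rendered as three comparisons.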



 "What I'm saying is you should make friends with a few catasses, they smell funny but they're very helpful."
-Calantus makes the best of a smelly situation.
tmp
Terracotta Army
Posts: 4257

POW! Right in the Kisser!


Reply #830 on: February 16, 2010, 05:24:10 PM

A sentient microphone may very well generate the equivalent of pain receptors to prevent damage to itself.  There is an evolutionary benefit to pain.
I didn't make my reasoning clear enough, I suppose. Our sensation of pain is a mechanism the body uses to draw our attention to an important matter, effectively creating a primitive equivalent of an interrupt: "stop wanking to that mental image of a quarian, someone just shot you in the foot; this noise is too loud, do something about it."

In contrast, there's no need for an AI to try and emulate this in an identical manner, because it already has a much better way to achieve the same effect: simply attaching an actual priority to signals from its sensors, based on the strength of those signals and whether that strength crosses some threshold. So while our little microphone may visibly flinch to shield its hardware from too loud a noise, if it records a signal strong enough to make such a reaction sensible, that reaction wouldn't be accompanied by the physical discomfort we experience due to limitations of our own "hardware", which wasn't able to develop a better way to ensure important messages don't go unnoticed.
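tmp's "priority instead of pain" idea can be sketched in a few lines of Python (the sensor names and threshold are invented for illustration): signal strength becomes queue priority, so a dangerous signal jumps to the front without any simulated discomfort.

```python
import heapq

ATTENTION_THRESHOLD = 0.8  # signals above this get handled immediately

def triage(readings):
    """Order hypothetical sensor readings by signal strength: the
    'priority instead of pain' idea.  Nothing hurts; the strongest
    signal simply goes to the front of the queue."""
    # heapq is a min-heap, so negate strengths to pop the strongest first.
    queue = [(-strength, name) for name, strength in readings]
    heapq.heapify(queue)
    ordered = []
    while queue:
        neg, name = heapq.heappop(queue)
        ordered.append((name, -neg, -neg >= ATTENTION_THRESHOLD))
    return ordered

readings = [("ambient_hum", 0.05), ("footsteps", 0.30),
            ("gunshot", 0.95), ("fan_noise", 0.10)]
for name, strength, urgent in triage(readings):
    print(f"{name}: strength={strength:.2f} urgent={urgent}")
```

The gunshot is handled first and flagged urgent; the hum waits its turn. No interrupt-by-agony required, which is exactly the contrast tmp is drawing with our own hardware.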
Venkman
Terracotta Army
Posts: 11536


Reply #831 on: February 16, 2010, 05:30:29 PM

So why hasn't schild called y'all out for discussing AI sentience here?!  Oh ho ho ho. Reallllly?
NowhereMan
Terracotta Army
Posts: 7353


Reply #832 on: February 16, 2010, 05:45:38 PM

On the photovore point: one of the theories in cognitive science that has proved quite fruitful for analysing human cognition is embodied cognition, the idea that the way our bodies work has a profound impact on how we think. In some ways it's very obvious; we see hammers as a certain set of possibilities based upon how we can interact with them, but we tend to lose sight of this relation when we get into what we think of as the 'higher' functions of our cognition. The dualist tradition, Descartes and so forth, strongly influenced this way of thinking in modern Western thought: that what truly makes us special is the part of the mind that is somehow divorced from the body. Embodied cognition theories try to break down this distinction and argue that, basically, our higher functions are still rooted in our basic bodily processes, that mind and body function as a single organism, and so things like emotions, while potentially irrational, aren't non-rational. That is, they can go wrong in our thought process, but they aren't some intrusion of hardware into the operation of software; that distinction is at best a useful fiction for some aspects of understanding ourselves.

And some silly thoughts on the Geth only to bore people interested in this.

"Look at my car. Do you think that was bought with the earnest love of geeks?" - HaemishM
Khaldun
Terracotta Army
Posts: 15189


Reply #833 on: February 17, 2010, 06:19:34 AM

On the Geth:

If their sentience emerged as an accidental byproduct of being networked by the Quarians, with their bodies only being tools, then for sure they'd have a very alien way of thinking compared to that of organics. But I'm still not seeing why that would necessarily involve a lack of emotions or other highly subjective ways of interpreting and reacting to external environments. Take the entire design of their bodies since breaking away from Quarian control: they've built semi-humanoid bodies which are expressly designed for combat against humanoids. They've committed near-genocide against the Quarians. That strikes me as an emotional response to the possibility of losing their sentience again at Quarian (or other) hands. A dispassionate response would be just to build little "network seeds" and spread redundantly everywhere throughout the galaxy, particularly on planets that organics find unpleasant or forbidding. Or, for that matter, to be indifferent to the question of protecting your sentience in the first place. What does it matter whether you're sentient or not? Or enslaved to organics or not? Conscious self-preservation and self-autonomy is, at root, an emotional state: a subjective *feeling* that one's continued existence *and* freedom is worth something.
tmp
Terracotta Army
Posts: 4257

POW! Right in the Kisser!


Reply #834 on: February 17, 2010, 07:40:52 AM

Conscious self-preservation and self-autonomy at the root of it is an emotional state: a subjective *feeling* that one's continued existence *and* freedom is worth something.
Question is when it's indeed conscious self-preservation rather than just a case of following a built-in rule to protect one's existence (a variant of Asimov's Third Law of Robotics). Not exactly different from our own self-preservation instinct, which can also be argued to often be mechanical rather than emotional; the emotional part is in being able to actually overcome that instinct.

Hmm, building on that, one could come to the conclusion that a being can be accepted as sentient when it can question the point of its own existence?
Khaldun
Terracotta Army
Posts: 15189


Reply #835 on: February 17, 2010, 09:25:22 AM

Seems like a pretty good benchmark.
Tarami
Terracotta Army
Posts: 1980


Reply #836 on: February 17, 2010, 10:39:04 AM

Next stop: Descartes.

- I'm giving you this one for free.
- Nothing's free in the waterworld.
Phildo
Contributor
Posts: 5872


Reply #837 on: February 17, 2010, 12:29:03 PM

Next stop: Descartes.

But the Geth don't have a pineal gland.  Or do they?
Khaldun
Terracotta Army
Posts: 15189


Reply #838 on: February 17, 2010, 01:10:33 PM

Maybe Legion scavenged a pineal gland along with Shep's old armor.

I need to start a Renegade FemShep soon. I've noticed that a lot of players tend to make their renegade run as a femshep, which is kind of interesting.
Ingmar
Terracotta Army
Posts: 19280

Auto Assault Affectionado


Reply #839 on: February 17, 2010, 01:54:23 PM

Probably just down to the fact that the sort of 'default' for most people is a 'good' runthrough as your own gender.

The Transcendent One: AH... THE ROGUE CONSTRUCT.
Nordom: Sense of closure: imminent.