Translating Emotion into Variables — The Three-Layer Emotion Model
Now it's time to design emotion in earnest. Having identified the limits of Affection+1, we need to build a system to replace it. I told AI: "I don't want a single affection variable. I want a variable system that represents human emotion more realistically."
And this is where AI really started showing its strength. Structuring emotion — decomposing something abstract into variables and layers — is exactly the kind of work AI excels at.
How Many Emotions Do We Need?
That was the first question. "How many emotion variables should we have?"
AI's answer was surprisingly cautious. "More isn't always better. Too many variables lead to overengineering." Then it proposed a crucial distinction:
The types of emotion you express can be many. But the state values the system directly tracks must be structurally organized.
What does that mean? Love, sadness, anger, arrogance, envy, jealousy, belief, trust, tension, conflict, fear, terror, admiration. You could list all of these as independent variables. But then the interaction rules explode. Which event changes what by how much? Can love and trust rise simultaneously? Under what conditions do envy and admiration diverge? If you have many variables without clear definitions, the game becomes even more fake.
So AI proposed separating emotion into layers.
The Three-Layer Model
Through iterative conversation with AI, this structure emerged:
Layer A: Core Disposition / Drives — The character's innate values. They don't change easily. Set at game start, they barely shift during play.
Need for Affection (need_for_affection)
Fear of Rejection (fear_of_rejection)
Pride (pride)
Insecurity (insecurity)
This is the character's "personality." The fundamental reason why this person acts the way they do.
Layer B: Current Emotional State — Fluctuates in response to scenes and events. Momentary and fluid.
Affection (affection)
Anger (anger)
Sadness (sadness)
Jealousy (jealousy)
Admiration (admiration)
Fear (fear)
This is what the character feels "right now, in this moment." It can change from scene to scene.
Layer C: Relationship State — The accumulated result with the other person. Changes relatively slowly.
Trust (trust)
Reliance (reliance)
Intimacy (intimacy)
Tension (tension)
Resentment (resentment)
Respect (respect)
This is "the state of the relationship built up so far." A single event won't dramatically alter it.
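The three layers above can be written down as plain data. A minimal Python sketch: the variable names are the ones listed in each layer, but the class name, the grouping into dicts, and the starting values (50 for disposition, 0 elsewhere) are my assumptions, not part of the design above.

```python
from dataclasses import dataclass, field

@dataclass
class CharacterEmotion:
    # Layer A: innate disposition, set at game start, barely shifts during play
    disposition: dict = field(default_factory=lambda: {
        "need_for_affection": 50, "fear_of_rejection": 50,
        "pride": 50, "insecurity": 50,
    })
    # Layer B: momentary emotional state, fluctuates scene to scene
    state: dict = field(default_factory=lambda: {
        "affection": 0, "anger": 0, "sadness": 0,
        "jealousy": 0, "admiration": 0, "fear": 0,
    })
    # Layer C: accumulated relationship state, changes slowly
    relationship: dict = field(default_factory=lambda: {
        "trust": 0, "reliance": 0, "intimacy": 0,
        "tension": 0, "resentment": 0, "respect": 0,
    })
```

Keeping the three layers in separate containers also enforces the update-speed rule mechanically: scene code touches only `state`, end-of-event code touches `relationship`, and nothing touches `disposition`.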
Why Split into Three?
AI explained the importance of this separation with examples.
"High jealousy and high trust at the same time" becomes possible. You like and trust someone but still feel jealous. That's realistic. With a single affection score, this is impossible: if jealousy goes up, affection goes down — that's the only way to express it.
Another example. "High affection but low trust." Attracted but can't rely on them. "High trust but low affection." Precious, but not romantic. A single score can't show this difference, but with three layers, it's expressed naturally.
And the most important thing: love is nowhere. There is no love variable in the system. Instead, love "emerges" from a combination of Layer C variables and accumulated behavior.
Trust >= 70
Intimacy >= 60
Fear of Loss >= 30
Chose the other person repeatedly, >= 5 times
When these conditions are met, the player feels "this is love." Not love += 1, but love appearing as the result of accumulated conditions. Design by emergence.
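The emergence conditions above reduce to a simple predicate. A minimal sketch: the thresholds come from the list above, while the function name and the way fear of loss and the choice counter are passed in are my assumptions.

```python
def love_emerges(relationship, fear_of_loss, times_chose_partner):
    """Love is never incremented directly; it 'emerges' only when
    all accumulated conditions hold at the same time."""
    return (
        relationship["trust"] >= 70
        and relationship["intimacy"] >= 60
        and fear_of_loss >= 30
        and times_chose_partner >= 5
    )
```

Because this is a pure function over existing state, no event ever writes `love += 1`; the game only ever asks whether the conditions currently hold.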
Every Emotion Needs a Definition
One thing AI insisted on strongly: "If you've created a variable, you must write a definition document."
belief: The degree to which you assume the other person's words are true
trust: The degree to which you can entrust yourself to the other person
envy: Wishing you had what the other person has
jealousy: Anxiety that the other person might be taken by someone else
tension: Discomfort that hasn't surfaced yet
conflict: A clash that has already become visible
fear: Anxiety that produces avoidance or withdrawal
admiration: The feeling of wanting to resemble or looking up to someone
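One way to keep those definitions from drifting away from the code is to store them next to it, so that using an emotion nobody has defined fails loudly. A sketch under my own assumptions: the registry name and the validation helper are hypothetical, and the definition strings are the ones listed above.

```python
# Every emotion variable the system tracks must have a written definition.
EMOTION_DEFINITIONS = {
    "belief":     "The degree to which you assume the other person's words are true",
    "trust":      "The degree to which you can entrust yourself to the other person",
    "envy":       "Wishing you had what the other person has",
    "jealousy":   "Anxiety that the other person might be taken by someone else",
    "tension":    "Discomfort that hasn't surfaced yet",
    "conflict":   "A clash that has already become visible",
    "fear":       "Anxiety that produces avoidance or withdrawal",
    "admiration": "The feeling of wanting to resemble or looking up to someone",
}

def require_defined(emotion):
    # Refuse to touch an emotion variable that has no written definition.
    if emotion not in EMOTION_DEFINITIONS:
        raise KeyError(f"No definition written for emotion: {emotion!r}")
```

Calling `require_defined` at the top of any update function turns "someone invented a variable without defining it" from a silent design drift into an immediate error.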
Without these definitions, the writer, designer, and coder all end up working in different directions. If "trust goes up" means different things to different people, the system loses consistency.
This resonated immediately because of my developer experience. It's the same problem as ambiguous variable names in code: they cause confusion down the road. Emotion variables are no different. Names alone aren't enough; you need definitions.
Event Reaction Functions
Listing variables is just the beginning. What really matters is "which event changes which variable by how much."
AI proposed this structure: don't write every change rule manually — templatize them by event type.
```python
# Reaction templates: each event type maps to a set of emotion deltas.
REACTIONS = {
    "kept_promise":       {"trust": +8, "belief": +5, "affection": +3},
    "public_humiliation": {"anger": +10, "sadness": +6, "trust": -7, "conflict": +8},
    "saw_with_rival":     {"jealousy": +9, "fear": +4, "affection": +2},
}

def apply_reaction(em, kind):
    # Apply the delta template for the given event type to the emotion state.
    for emotion, delta in REACTIONS[kind].items():
        em[emotion] = em.get(emotion, 0) + delta
```
A "kept promise" event raises trust by 8, belief by 5, affection by 3. A "public humiliation" event raises anger by 10 and drops trust by 7. Defining emotion changes at the event level makes the system manageable.
And one more thing: Don't calculate every emotion every scene — only pull out the "active emotions." Even if the system has 20 emotions, any given scene only uses 2 to 4.
Reunion scene: longing, wariness, trust
Betrayal scene: anger, sadness, conflict
Rival appears: jealousy, admiration, anxiety
Limiting active emotions per scene preserves depth while keeping complexity under control.
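The per-scene filtering described above can be sketched as a lookup table plus a projection. The scene table mirrors the three examples listed; the table and function names are my assumptions.

```python
# Each scene type declares the 2-4 emotions it actually uses.
ACTIVE_EMOTIONS = {
    "reunion":       ["longing", "wariness", "trust"],
    "betrayal":      ["anger", "sadness", "conflict"],
    "rival_appears": ["jealousy", "admiration", "anxiety"],
}

def emotions_for_scene(em, scene):
    # Project the full emotion state down to only the scene's active emotions.
    return {name: em.get(name, 0) for name in ACTIVE_EMOTIONS[scene]}
```

The rest of the emotion state still exists and still accumulates; the scene logic simply never reads it, which is what keeps the per-scene complexity bounded.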
Why This Design Is Possible
Honestly, if I'd tried to design a system this complex alone, I would have given up halfway through. Defining relationships between variables, tuning change amounts per event, building test cases — the work is massive.
It's possible because AI is here. Proposing variable structures, writing definition documents, drafting event reaction functions, simulating whether the balance holds. For this kind of repetitive, structural work, AI is fast and precise.
Of course, the final judgment has to be human. "Is trust going up by 8 when a promise is kept correct? Wouldn't 5 be more appropriate?" That kind of intuitive tuning is beyond AI. But rapidly producing drafts and immediately reflecting revisions — that's AI's strength.
In the next entry, we dig deeper into the core of this three-layer model. Designing love without directly incrementing it.
Next: Designing Love Without Directly Incrementing It — Love Is the Result of Cumulative Interpretation