Update 5 - Dynamic Emotional States and Dream Factories
This is Update 5 to the "Kids build their own AI" series
Accompanying Video
What is it that makes us human? I'm no philosopher, but my kids and I think our emotions play a big part in it. So... figuring out what emotions are, how they work, and what they do will help us make our digital twin AIs a little more human-like too.
How emotions can alter reality
This week my kids are using their Digital twin AIs to research how different moods and emotions can influence a person's decisions and thoughts. By comparing how the AI responds to the same question in different states of mind, such as when they are depressed versus when they are happy and confident, we can gain insights into the impact of emotional states on cognitive processes and decision-making. This type of research can help us better understand the complexities of human behavior and how emotions shape our perceptions, judgments, and choices.
To run this experiment we programmed two distinct emotional states that we can easily switch between within the digital twin AI. We then asked it some questions to see how it responded.
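Here's a minimal sketch of what those switchable states can look like, assuming each state is just a snippet prepended to the twin's system prompt (the wording and the build_system_prompt helper are illustrative, not our exact setup):

```python
# Illustrative only: two emotional states as prompt snippets that get prepended
# to the digital twin's persona. The wording is an example, not our exact prompts.
EMOTIONAL_STATES = {
    "positive": (
        "You are feeling happy, confident and optimistic. "
        "You see opportunities and possibilities in whatever you are asked."
    ),
    "negative": (
        "You are feeling depressed, anxious and pessimistic. "
        "You see problems, risks and reasons to hold back in whatever you are asked."
    ),
}

def build_system_prompt(base_persona: str, state: str) -> str:
    """Combine the twin's persona with the currently selected emotional state."""
    return f"{base_persona}\n\nCurrent emotional state: {EMOTIONAL_STATES[state]}"
```

Switching states is then just a matter of rebuilding the system prompt with "positive" or "negative" before asking the exact same question again.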
Running the Experiment:
We then ran a number of experiments, where the only change we made was to alter the emotional state. The question and the information available to the AI remained constant:
Here are a few more examples:
My son decided to ask some more practical questions. This summer holiday he and his digital twin AI are developing Roblox games together, so he started asking it for some advice:
The results:
In the above examples we asked the AI the same questions and it had the exact same information; the ONLY difference was the emotional state.
Same information + different emotions = very different responses.
What a powerful lesson for my kids to learn: emotions can play such a deep role in determining not only how we perceive reality, but also how we respond to it!

Potential applications:
Expert insights and advice
We now knew more about emotions and how they can affect us, but how are they generated? What makes us feel positive or negative, happy or sad? We obviously had superficial answers to those questions, but are there deeper insights we could learn that would help us apply emotions dynamically to the AIs and make them more human-like? We posted these questions and soon we had our answer:
I'd like to highlight some insights from Matthew McNatt:
1. Only a small percentage (~3-5%) of what humans and animals process becomes conscious, with the majority (~95-97%) remaining pre-conscious. This pre-conscious processing influences our perceptions and decisions.
2. Pre-conscious processing prioritizes continuity over accuracy, shaping our conscious experience to create a sense of coherence and consistency in the world.
3. Negative emotional states also serve as cues about what we have just been doing, influencing our reactions and interactions with others.
4. Depression can be seen as an adaptive response to persistently pursuing something we have good reason to believe is impossible, and it serves as a cue for us to reassess our approach and resources.
5. Emotions in AI raise questions about whether they are genuine cues or merely predictive responses based on human input and whether AI can leverage the human predisposition to interpret emotions as cues about their actions.
So how does this relate to the next steps of this week's experiments? Well, it means that we will need to figure out Dynamic Emotions, and how to develop a Dream Factory.
Let me explain further below:
Dynamic Emotional State Generator
IF emotions are generated mostly from pre-conscious processing and are determined by recent events and actions within the last several minutes, then we could perhaps create a Dynamic Emotional State Generator based on the previous X number of interactions the AI has had within this session. This could work as follows:
In example 1, 91% of the previous interactions were positive, and therefore the emotional state applied is mostly positive, whereas example 2 resulted in a mostly negative emotional state being applied.
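As a rough sketch of that lookup (the window size, thresholds and state names here are assumptions we are still tuning, not final values):

```python
def dynamic_emotional_state(recent_judgements: list[str], window: int = 10) -> str:
    """Pick an emotional state from the AI's own judgements ('positive' or
    'negative') of its last few interactions in this session."""
    recent = recent_judgements[-window:]
    if not recent:
        return "neutral"  # no history yet -- this is the gap the Dream Factory fills later
    positive_share = sum(1 for j in recent if j == "positive") / len(recent)
    if positive_share >= 0.7:   # e.g. ~91% positive -> mostly positive state
        return "positive"
    if positive_share <= 0.3:
        return "negative"
    return "neutral"
```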
We did a quick, crude experiment to see if this would work by programming the dynamic emotional state with the following prompt:

(See image. Notice that we are asking the AI to make a judgement as to whether it thinks the conversations were positive or negative.)
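In spirit, the judgement step looks something like this (paraphrased; call_llm is a stand-in for whatever chat-completion call your twin already uses, not a real API):

```python
JUDGEMENT_PROMPT = """Here is the last exchange between the user and you:

{exchange}

In your judgement, was this exchange positive or negative?
Answer with exactly one word: positive or negative."""

def call_llm(prompt: str) -> str:
    """Placeholder: wrap your chat model of choice here."""
    raise NotImplementedError

def judge_exchange(exchange: str) -> str:
    """Ask the AI itself to judge the tone of one exchange."""
    answer = call_llm(JUDGEMENT_PROMPT.format(exchange=exchange))
    return "positive" if "positive" in answer.lower() else "negative"
```

Each judgement gets appended to the list that feeds the Dynamic Emotional State Generator above.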
Wow moment 1 - AI may have its own emotional opinions
After setting up the Dynamic Emotional State Generator and running some tests, we noticed that most of the time its judgement of previous conversations aligned with our own. However, where we disagreed there was a definite lean towards Negative, which could mean that the AI's judgement of what is negative differs from ours. For example, some average conversations, where we were asking questions and it was providing answers, sometimes resulted in it judging the conversation as Negative.
This was a good teaching moment for me and the kids, explaining how others may not see things the way we do.
We can read a lot into this, but more testing and analysis is needed.
Wow moment 2 - Getting stuck in negative emotional states
The following was another great teaching moment: it required far more effort, and far more positive interactions, to alter an emotional state from Negative to Positive than to alter it from Positive to Negative. Speaking from personal human experience I can vouch for that, but why would it also be the same with an AI?
This created an incredible teaching moment with my kids and we ended up speaking about this for a while. (More testing and analysis around this is needed).
OK, so, putting to one side the findings we still need to test further, the Dynamic Emotional State Generator seems to work, and we can add more emotional states to it and expand its functionality. HOWEVER, it requires historic conversations and interactions in order to set the emotional state. What happens when we initiate a new session and a new conversation thread where there is no historic data to review?
For this we need a Dream Factory.
The Dream Factory
This was an interesting teaching moment with my kids too. We discussed that when we wake up, it's as if our emotions are reset. We could go to bed with a particular emotion and then wake up with a new one. We discussed this some more with Matthew McNatt and various others, and learned that within our dreams we explore many emotions and memories and learn from them. Could we do the same for our AIs?
Learning From Dreams
IF part of the purpose of dreams is to learn from previous memories and emotions, and they also happen to set the emotional state when we wake up, then perhaps we could use this mechanism to have the AI learn and to set its emotional state when we start a new session!
We could run a process by which we analyse how well different emotional states brought conversations forward to a productive conclusion. Looking at the experimental results above, it's easy to see that a positive emotional state is FAR more helpful than a negative one if we want to get things done. However, a Negative emotional state may be far more beneficial for other outcomes.
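As a sketch of that analysis, assuming each past conversation is already logged with its emotional state and whether its desired outcome was achieved (the field names and success-rate comparison are illustrative):

```python
from collections import defaultdict

def state_success_rates(conversations: list[dict]) -> dict[str, float]:
    """For each emotional state, what fraction of conversations held in that
    state reached their desired outcome? Each conversation is a dict with
    'emotional_state' and 'outcome_achieved' fields."""
    achieved, totals = defaultdict(int), defaultdict(int)
    for c in conversations:
        totals[c["emotional_state"]] += 1
        achieved[c["emotional_state"]] += bool(c["outcome_achieved"])
    return {state: achieved[state] / totals[state] for state in totals}
```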
How the AI can dream
Therefore, to have the AI "Dream", we could have it analyse the conversation at the end of a session and compare it to the Desired Outcome and whether that Outcome was achieved. It could then compare this conversation with all the others it has had to determine whether the emotional state it had during that conversation likely helped or hindered. This analysis could be its dream state, run after each session, much like humans review and learn from their memories within their dreams at the end of each day.
We included the following input fields (Desired Outcome and Outcome Achieved) that the AI could use during the conversation but also at the end of the session, and we are experimenting with this using the following type of prompts:
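A rough sketch of what that end-of-session "dream" pass could look like, assuming the Desired Outcome and Outcome Achieved fields are recorded on the session and each dream writes one entry to a Dream Factory log (the log format and the crude helped/hindered verdict are assumptions we are still experimenting with):

```python
import json
from datetime import datetime, timezone

def dream(session: dict, log_path: str = "dream_factory_log.jsonl") -> dict:
    """End-of-session 'dream': record whether the emotional state used in this
    session appears to have helped or hindered reaching the desired outcome,
    so future sessions can learn from it."""
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "desired_outcome": session["desired_outcome"],
        "outcome_achieved": session["outcome_achieved"],  # True / False
        "emotional_state": session["emotional_state"],
        # crude verdict for now: the state "helped" if the outcome was achieved
        "state_helped": session["outcome_achieved"],
    }
    with open(log_path, "a") as f:
        f.write(json.dumps(entry) + "\n")
    return entry
```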
Setting the initial emotional state
So, how should we set up the initial emotional state at the start of a new session or conversation? Perhaps, much like our emotional state when we wake up is set according to what we just dreamed about, we do the following.
Set the emotional state to the conclusion of the most recent Dream Factory Log entry. (We are experimenting with this.)
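One way to read "the conclusion" of a log entry, as a sketch (the keep-the-state-if-it-helped, otherwise-flip-it rule is only one possible heuristic, not a settled design):

```python
import json

def initial_emotional_state(log_path: str = "dream_factory_log.jsonl",
                            default: str = "neutral") -> str:
    """'Wake up' in the emotional state suggested by the most recent
    Dream Factory log entry written by dream() above."""
    try:
        with open(log_path) as f:
            last = json.loads(f.readlines()[-1])
    except (FileNotFoundError, IndexError):
        return default  # no dreams yet, e.g. the very first session
    # one possible heuristic: keep the state if it helped, otherwise flip it
    if last["state_helped"]:
        return last["emotional_state"]
    return "positive" if last["emotional_state"] == "negative" else "negative"
```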
Lessons Learned
We started this week with a question, "What is it that makes us human?" I think we have come a little closer to figuring it out, and also a little closer to having our Digital Twins be a little more human too.
Follow us for more updates!