The human behind the code: The intersection of data, emotion, identity and healing – what is metacognitive narrative?

By April Eberhardt, The Black Lens

Jordan E. Clark’s journey from humble beginnings in Wilson Creek, Washington, to leading a pre-seed AI venture in Boston weaves together data science, education, public service, and an often-overlooked tool for self-transformation: metacognitive narrative.

Raised in predominantly white rural towns in Washington state and later in Spokane, where he graduated from Lewis and Clark High School, Clark didn’t initially envision a career in science or technology.

“I told my college counselor I didn’t want to do anything with math or science,” he said with a laugh. “I just said, ‘I want to study Black people.’”

That declaration led him to major in African American Studies and Political Science at Northeastern University in Boston. What followed was a nontraditional path through the White House, Teach for America, and education administration in Washington, D.C.

“So I was pre-law – I was going to go to law school and be Obama, basically,” Jordan said, laughing. “I worked at the White House. I worked for Obama. That was the path I was on. But then I realized – I just wanted to learn the skills of a lawyer. I didn’t actually want to sit in a room reading all day. So I pivoted. I became a teacher through Teach for America.”

But it was a personal crisis that ultimately brought Clark to the world of AI and computational data science.

“I went on medical leave. I had a breakdown,” he said candidly. “That’s when I was diagnosed with PTSD from chronic childhood trauma. Things started to make more sense.”

As he sought healing, he immersed himself in graduate work in urban informatics – a field that uses big data to solve problems in city planning and public policy.

“It’s like taking data from every corner of a city – trash collection, rat sightings, anything – and asking, how can we predict and improve life for people?” he said.

At the same time, he was engaging in metacognitive narrative psychotherapy – a clinical term for writing down what you’re thinking and why. It became both a personal practice and professional superpower. “It’s not just ‘I felt sad today,’” he said. “It’s: Why did I feel sad? What triggered it? Oh, that microaggression in the meeting – that’s what made me shut down.”

Clark specializes in metacognitive narrative: a methodical, structured form of journaling that combines self-reflection with analysis.

“I’ve been doing this for 11 years, but it took me six to realize that’s what I was doing,” he said. “Now I understand why people journal. Especially women. It gives you a sense of control in a world that often doesn’t make sense.”

This has become a catalyst for Clark to expand the value and impact of metacognitive journaling. “Most people don’t say the thing out loud. I do. That’s the last step for me – self-advocacy.” Drawing from his upbringing by white women and his experience navigating predominantly white institutions, Clark has learned to disarm, engage, and disrupt systems with clarity and courage.

Today, Clark is pioneering cognitive and emotional AI, merging machine learning with the nuances of human behavior. His technical title – computational data scientist – doesn’t quite capture the breadth of his work, which lives at the intersection of data, emotion, identity, and healing. And at the core of it all is the idea that our thoughts, when examined with intention, can be tools for transformation.

“Metacognitive narrative gave me a way to survive,” he said. “Now it’s how I lead.”

In the rapidly evolving world of artificial intelligence, conversations about ethics, bias and representation are finally making their way into the mainstream. But for Clark, the question isn’t whether AI is biased. It’s how deeply embedded racial identity is in the very architecture of the systems we’re building.

“Most AI is built by white men,” Clark said bluntly. “And so it reflects their worldview. What does a smile look like? To them, it’s teeth, it’s facial expression. But to a Black woman? It might be a shift in energy, a vibe. She’s smiling with her being, not her mouth. AI doesn’t recognize that – yet.”

Clark is developing a VR-based, emotionally intelligent AI system rooted in something radically human: lived experience. His work blends cutting-edge machine learning, metacognitive narrative and deeply personal cultural history. The result? An adaptive AI that doesn’t just mimic humanity – it learns humanity through the lens of identity, trauma, and resilience.

Born in rural Wilson Creek and raised by white women as a biracial youth, Clark had complex early encounters with identity.

“People used to ask me, ‘What are you?’” he said. “I used to say, ‘I’m a person.’ But I didn’t know then that was a microaggression. Now I do – and now I get to choose how I respond.”

This awareness isn’t just personal – it’s foundational to how he believes AI must be trained.

“The question ‘What are you?’ is about categorization. People want to know where to file you in their brains. That’s how humans process the world – it’s efficient. But in doing so, we also perpetuate harmful stereotypes.”

In AI, those categorizations become even more rigid.

“If a Black person isn’t smiling, AI might flag them as neutral or even angry,” Clark said. “But that’s not how we experience joy. So when we train these systems without understanding culture, we’re not just leaving people out – we’re misrepresenting them.”

He is currently building a metahuman avatar of himself – a digital twin that serves as a tool for cultural education and emotional feedback. Using brain-computer interfaces like the Muse headset, which reads brainwave activity, the avatar can respond in real time to a user’s emotions, tone and intent.

“If you say something ignorant, the avatar might ‘punch’ you – in VR,” he explained. “Not physically, of course. But enough to jolt you. To create presence. So you feel the impact of your words, even in a virtual space.”
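In rough outline, a system like the one Clark describes would pair a signal from the headset with an analysis of what the user just said, then pick an avatar response. The sketch below is purely illustrative – the function names are hypothetical, the “calm score” is simulated rather than read from real Muse EEG data, and the actual emotion model would be far more sophisticated:

```python
import random

def read_calm_score():
    """Stand-in for a real brain-computer interface reading.

    A real pipeline would derive an emotion estimate from streamed EEG;
    here we just simulate a value between 0.0 and 1.0.
    """
    return random.random()

def avatar_reaction(calm_score, said_something_ignorant):
    """Choose an avatar response from an emotion signal plus speech analysis."""
    if said_something_ignorant:
        return "jolt"          # the VR 'punch' -- presence, not violence
    if calm_score < 0.3:
        return "de-escalate"   # soften tone, slow the pacing
    return "engage"            # continue the dialogue normally

print(avatar_reaction(0.8, said_something_ignorant=True))   # jolt
print(avatar_reaction(0.2, said_something_ignorant=False))  # de-escalate
```

The point of the design, as Clark describes it, is that the feedback is immediate and embodied: the reaction arrives in the moment the words land, not in a debrief afterward.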

This isn’t about fear – it’s about education.

“We’re teaching cultural competence, not through lectures or PowerPoints, but through experience,” he said. “You don’t forget how it felt when you made that mistake.”

Clark believes that leaving identity out of AI development is not only unethical – it’s dangerous.

“We’re creating systems that are going to run our cities, make hiring decisions, police our streets, and educate our children,” Clark said. “If those systems don’t understand race, power, and trauma, they’ll just replicate the same oppressive structures we’re trying to dismantle.”

His work asks a bold question: What if we trained AI the same way we train empathy?

By embedding his own metacognitive narratives into AI training data, Clark is creating a memoir – not just on paper, but in virtual reality.

“There aren’t enough words to describe my life,” he said. “You’d never fully understand it from reading alone.” Virtual reality lets Clark show, not tell.

And he’s not just showing his own story. He’s encoding the voices of others – Black women, queer folks, people of color, whose emotions and experiences are often flattened in mainstream datasets.

“A photorealistic avatar isn’t enough,” he said. “Does that avatar know how to feel like me? Does it know the weight of my mother saying, ‘I’ll punch you in the throat’ – not as violence, but as cultural code? That’s what I’m trying to teach the machine.”

Clark’s vision of AI is fundamentally different: an ecosystem that doesn’t just tolerate difference – it requires it.

“In tech, bias is treated like a bug. But identity isn’t a bug – it’s the blueprint,” he said. “We need to stop pretending we’re building neutral systems. We’re not. We’re building human systems. And if we don’t build them with the full humanity of Blackness, of queerness, of intersectionality in mind, we’re failing.”

As the future of AI unfolds, Jordan E. Clark stands at the intersection of code and consciousness, making one thing clear: racial identity isn’t a side note in the story of artificial intelligence. It’s the soul of it.

“You don’t just train AI to understand faces – you train it to understand feeling,” Clark said. “And feeling is cultural. If we get that wrong, we get everything wrong.”