Machines

Hey!! I’m Clara, and besides music I’m deeply interested in the mechanics of cognitive processes in machines. I originally started off with a degree in Business, but soon found myself drawn to the parts of my program that intersected with Psychology and Math.

Following an interest in human nature, I added a degree in Psychology. Cognitive and Biological Psychology soon became my main interest, leading to an internship, a research assistant position and a thesis project with the group for Cognitive Neuroscience at the Max Planck Institute for Human Development (CNARC). What I loved about Cognitive Psychology was breaking cognitive processes down into simple, tangible mechanisms. I was fascinated by the idea of somehow creating cognitive processes.

Ultimately I wrote my thesis, “On the Evolution of Mental Representations underpinning Transitive Inference”, a topic at the intersection of Psychology, Math and Computer Science, and contributed to a publication on transitive inference learning. Not understanding most of the computer science literature only cemented my wish to somehow “build” cognitive systems.

After graduating I worked as a student worker at Universal Music, where I picked up VBA; a year later, when I started my master’s, the M.Sc. Cognitive Systems, this led to freelance work as a VBA developer. After a crash course in natural language processing and massively uplevelling my coding skills, I was quickly drawn to Machine Learning, deep-diving into the very basics of cognitive systems such as perceptrons and neural networks and slowly adding up the pieces to (almost) understanding attention and transformers.

In 2024, I completed a project on benchmarking multimodal LLMs, investigating how their architecture and training data affect performance. That same year I finished my master’s degree and worked at the German Institute for Artificial Intelligence, developing and evaluating more resource-efficient methods for relation extraction in the biomedical domain with transformer-based language models.