During my high school years, I began working as a developer at Viscando AB, which delivers 3D and AI solutions for traffic analysis. There, I worked extensively with front-end and back-end development and design, as well as machine learning. Over time my responsibilities grew, and I took ownership of projects that I also helped put into production. Among other things, I developed:
An SSO solution for authentication and authorization in a distributed system with internal and external applications, as well as edge units without guaranteed internet access.
An annotation system for video and images, optimized for fast labeling of large volumes of video, with accompanying evaluation tools that continuously analyzed the performance of various ML models across time, versions, and meta-parameters.
Traffic control systems, developed and evaluated in a user study, which required very fast iteration and a high level of quality assurance.
Technologies used: Node, GraphQL, React, PostgreSQL, Python, Go
Centre for Collective Action Research (CeCAR) at the University of Gothenburg
As an assistant at CeCAR, I drove the development and maintenance of a classification system that analyzed
European banks in relation to the EU taxonomy. The system covered data collection, annotation tools,
and classification of the banks' annual reports.
Technologies used: NLP (zero-shot classification using an NLI model), Node, PostgreSQL, Python
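The core idea behind zero-shot classification via NLI is to treat the document as a premise and each candidate label, wrapped in a template like "This text is about {label}.", as a hypothesis; the label whose hypothesis is most strongly entailed wins, with no task-specific training data. A minimal sketch of that mechanism, using a stub word-overlap scorer where a real system (like the one at CeCAR) would run a pretrained NLI model:

```python
import re

def entailment_score(premise: str, hypothesis: str) -> float:
    """Stub NLI scorer: a real system would run a pretrained NLI model
    (e.g. one fine-tuned on MNLI) and return the entailment probability.
    Word overlap stands in here purely for illustration."""
    premise_words = set(re.findall(r"[a-z]+", premise.lower()))
    hypothesis_words = set(re.findall(r"[a-z]+", hypothesis.lower()))
    return len(premise_words & hypothesis_words) / len(hypothesis_words)

def zero_shot_classify(text: str, labels: list[str]) -> str:
    """Turn each label into a hypothesis and pick the most entailed one."""
    template = "This text is about {}."
    return max(labels, key=lambda lab: entailment_score(text, template.format(lab)))

report = "The report discusses sustainable finance and green bond lending."
print(zero_shot_classify(report, ["marketing", "sustainable finance"]))
# → sustainable finance
```

Swapping the stub for a real NLI model changes nothing structurally, which is what makes the approach attractive when labels evolve, as they do when tracking a moving regulatory taxonomy.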
Cadence is an iOS app I created that analyzes workout data, giving users in over 50 countries new ways to visualize their training. Building it has given me new experience in both geographical data visualization and customer support.
Cadence uses HealthKit, which lets the app stay transparent about privacy since all data is stored on-device. The app was originally written in Swift using UIKit but has since been updated to SwiftUI.
Betygen is my oldest full-stack application; it lets students look up admission grades for schools in Sweden.
The site averages 2.5k unique visitors per week, is hosted on Heroku, and is built with React/Node/PostgreSQL. Rather than relying on any existing APIs, it uses a data pipeline that converts archive documents from municipalities into the dataset.
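A pipeline like the one behind Betygen boils down to normalizing heterogeneous municipal exports into one schema. A minimal sketch of such a normalization step, assuming a semicolon-separated export with Swedish column names and decimal commas (the column names and grade scale are illustrative; real archive documents vary per municipality and need per-source adapters):

```python
import csv
import io

def parse_admission_rows(raw_csv: str) -> list[dict]:
    """Normalize one municipality's archive export into dataset rows."""
    rows = []
    for rec in csv.DictReader(io.StringIO(raw_csv), delimiter=";"):
        grade = rec["antagningspoang"].replace(",", ".")  # Swedish decimal comma
        rows.append({
            "school": rec["skola"].strip(),
            "program": rec["program"].strip(),
            "year": int(rec["ar"]),
            "admission_points": float(grade),
        })
    return rows

raw = "skola;program;ar;antagningspoang\nExempelskolan;Naturvetenskap;2023;287,5\n"
print(parse_admission_rows(raw))
# → [{'school': 'Exempelskolan', 'program': 'Naturvetenskap',
#     'year': 2023, 'admission_points': 287.5}]
```

Keeping each source's quirks in its own small adapter makes it cheap to add a new municipality without touching the shared dataset schema.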