Education: Master of Science, Computer Science (1993)
Work experience
Independent software engineer ⋅ 2016-present
2019-present: Ruter AS
I work as solution architect, tech lead and full-stack developer on a team that develops and maintains an event-driven
microservices architecture for processing, monitoring and visualising live and historical public transport data.
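To give a flavour of the event-driven pattern, the sketch below shows a minimal Kafka consumer that derives a simple delay metric from vehicle-position events. It is an illustration only: the production services are written in Kotlin on Kafka Streams, and the broker address, topic, consumer group and field names here are all hypothetical.
```python
# Illustrative sketch only: a minimal Kafka consumer deriving a delay metric from
# (hypothetical) vehicle-position events. The real services use Kotlin + Kafka Streams;
# all topic, group and field names below are assumptions.
import json

from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",   # assumption: local broker for illustration
    "group.id": "vehicle-position-monitor",  # hypothetical consumer group
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["vehicle-positions"])    # hypothetical topic

try:
    while True:
        msg = consumer.poll(1.0)
        if msg is None or msg.error():
            continue
        event = json.loads(msg.value())
        # Hypothetical payload: compare actual vs. scheduled timestamps per vehicle.
        delay_s = event.get("actual_ts", 0) - event.get("scheduled_ts", 0)
        print(f"vehicle={event.get('vehicle_id')} line={event.get('line')} delay_s={delay_s}")
finally:
    consumer.close()
```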
- Languages: Kotlin, Java, TypeScript, Python, SQL
- Frameworks & libraries: Spring Boot, Kafka Streams, Reactive Streams (Reactor), React, Hibernate/JPA, Apollo GraphQL, JetBrains Exposed, JUnit
- Persistence: PostgreSQL, Snowflake, Elasticsearch, Cassandra, S3
- Technologies: Amazon AWS, Kubernetes, Docker, Linux, Kafka, REST, WebSocket, GraphQL
- Tools: IntelliJ IDEA, Gradle, Git, GitLab CI/CD, Terraform, Datadog, Prometheus, Grafana, JIRA, Confluence
- Methods: DevOps, Scrum, Kanban, Domain Driven Design, TDD
2018-2019: Telenor Digital
I worked with the Digital Distribution Technologies team (since spun off as Millom),
which develops, maintains and operates a set of APIs that enable Google, Facebook and other service providers to
deliver services directly to subscribers across Telenor business units in Asia and Europe.
In particular, I developed an API in partnership with Google that bridges from REST via gRPC to Telenor’s
Diameter/SCTP peering network, facilitating subscriber authentication and phone number verification over EAP-AKA
directly from the subscriber’s handset.
- Languages: Java, Python, Groovy, HCL, Bash, C, Go
- Frameworks & libraries: JAX-RS, Protocol Buffers, jDiameter, JUnit
- Technologies: Amazon AWS, Linux, Docker/Kubernetes, Elasticsearch, gRPC, Diameter/SCTP, Nginx, OAuth2
- Tools: IntelliJ IDEA, Gradle, Git, Wireshark, Terraform, Jenkins, Loggly, New Relic, Kafka, JIRA, Confluence
- Methods: DevOps, Scrum, TDD
2017-2018: Telia Norge
In brief
- Full-stack developer (web, app, API, backend, DB)
- Technical architect & advisor
- Advisor to project management
- Systems operations
In detail
I worked with the OCMC Engineering group, which was responsible for developing, maintaining and operating web & customer
portals, mobile apps and internal CRM systems for Telia’s OneCall and MyCall
brands. Key accomplishments:
- Chief architect, lead developer and advisor to project management on
MyCall Money Transfer, an
offering developed in partnership with Western Union: I architected the overall solution and was the lead
(and mostly sole) developer, covering frontend (PHP/Laravel, JavaScript, CSS/Sass), user authentication and
authorisation (OAuth2), service layer (REST, PHP/Laravel, OCI8), data modelling and stored procedures (Oracle, PL/SQL),
and integration with Western Union (OAuth2, REST).
- Chief architect, developer, lead deployment engineer and advisor to project management on new web portal, OneCall:
I helped enforce improved coding standards (PHP/Laravel), built support for JWT authentication in the
service and database layers (JWT, OAuth2, REST, PHP/Laravel, PL/SQL, Oracle), architected and orchestrated the
integration of the new web portal with the existing customer portal (vanilla PHP, Smarty), set up the new
production servers (RHEL 7.4, Apache 2.4, PHP 7.2, OCI8 and other dependencies) and orchestrated the deployment
of web & customer portals, service layer and authentication framework in time for launch.
- Lead planner and architectural advisor on new customer portal, OneCall: I created JIRA epics and user stories that
pulled fragmented Sketch designs into a coherent whole, met with key developers to create an estimate for the
overall development effort, and used this estimate to convince the management team to revise the proposed timeline and to
build the new portal on React instead of PHP + vanilla JS/jQuery, to better align with Telia’s overall technology
stack.
- Lead deployment engineer for new production and staging environments, OneCall & MyCall: I set up new production
and staging servers (CentOS 7.5, Apache 2.4, PHP 7.2, OCI8 and other dependencies), deployed, verified and – if
required – made compatibility updates to all production sites (as well as key staging sites) before go-live.
- Cross-brand improvement activities: Among other things, I worked to reduce functional overlap and minimise
gaps in the service layers by better aligning them with
RESTful principles; to improve code
quality by elevating the PR approval process from a formality to a fruitful dialogue between peers, and by
incorporating existing test suites into CI builds; and to reduce setup costs of doing local development by
creating a set of lean, Xdebug-enabled Docker development containers.
- Cross-brand feature development and maintenance: Additionally, I did general troubleshooting, bugfixing and
development of small to medium features and offerings across web, mobile apps, service layers, and databases.
- Languages: PHP, HTML, CSS, JavaScript, PL/SQL, SQL, TypeScript
- Frameworks & libraries: Laravel, Symfony, Smarty, GuzzleHttp, jQuery, Node.js, Vue.js, React, Ionic, Angular
- Infrastructure: Linux, Apache, PHP-FPM, OCI8, REST
- Database: Oracle
- Tools: PhpStorm, Xdebug, Webpack, Gulp, Toad for Oracle, SQL Developer, Git/BitBucket Server, Bamboo, Docker,
Vagrant, JIRA, Confluence, Vim, Xcode, iTunes Connect
- Standards & formats: HTTP, OAuth2, JWT, JSON, CORS
- Methods: DevOps, Agile, Kanban, TDD
In brief
- Chief technical architect
- Full-stack developer (app, web, API, backend, DB)
- Systems operations
In detail
I worked as chief technical architect and full-stack developer for a Pokémon Go-inspired augmented reality concept
for product marketing. Key accomplishments:
- iOS* app for augmented reality-enhanced product marketing: The app fetches campaign locations based on the user’s
current location and lays them out on an interactive, MapBox-based map. When the user gets close enough to a location,
it launches an augmented reality game (implemented in JavaScript*, bridged via the Wikitude iOS SDK), incrementally
persisting game status back to the backend as the user accumulates points and prizes.
- Serverless NoSQL REST API: I defined the data model, set up separate development and production tables
on AWS DynamoDB, set up staged serverless AWS Lambda instances and wrote a versioned Lambda handler in Python
that handles basic CRUD operations for individual data entities and also supports optional sideloading of partial or
full object graphs rooted at the requested entity or entities (see the sketch after this list). To tie it all together,
I documented the corresponding REST endpoints in Swagger and shared them with the team via Apiary, then set up staged,
endpoint-agnostic AWS API Gateway instances and hooked them up to the corresponding AWS Lambda handlers.
- Single-page campaign management console: I wrote a management console in Angular 2+, hosted on AWS
S3, that lets campaign managers create, update and delete the campaigns and locations available for gameplay,
using the same REST API as the app.
- Utilities and libraries: I wrote a suite of Python and Bash scripts to manage the DynamoDB tables and Lambda handler
instances and deploy to AWS over the AWS CLI. In my own time I wrote and open-sourced a Swift
JSON caching library that the app utilises to cache data locally (in part
based on the replication engine I built for Origon; see below), as well as a
random test data generator for DynamoDB tables that I used to generate
dummy campaign data during development.
* An identical Android app, as well as the shared AR world, were implemented by another team member.
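As a rough illustration of the Lambda handler pattern referenced above: the sketch below is not the real handler; table names, path parameters and the stage variable are hypothetical, and the API versioning and object-graph sideloading are omitted.
```python
# Minimal sketch of a staged DynamoDB CRUD handler for AWS Lambda behind an
# API Gateway proxy integration. Table, path-parameter and stage names are
# hypothetical; versioning and sideloading of object graphs are omitted.
import json
import os
from decimal import Decimal

import boto3

dynamodb = boto3.resource("dynamodb")
STAGE = os.environ.get("STAGE", "dev")  # assumption: stage passed via environment


def handler(event, context):
    params = event.get("pathParameters") or {}
    entity = params["entity"]            # hypothetical resource name, e.g. "campaigns"
    entity_id = params.get("id")
    table = dynamodb.Table(f"{entity}-{STAGE}")  # e.g. "campaigns-dev" / "campaigns-prod"
    method = event["httpMethod"]

    if method == "GET" and entity_id:
        item = table.get_item(Key={"id": entity_id}).get("Item")
        return _response(200 if item else 404, item or {"error": "not found"})
    if method in ("POST", "PUT"):
        # parse_float=Decimal: DynamoDB does not accept Python floats
        body = json.loads(event["body"], parse_float=Decimal)
        table.put_item(Item=body)
        return _response(201 if method == "POST" else 200, body)
    if method == "DELETE" and entity_id:
        table.delete_item(Key={"id": entity_id})
        return _response(204, {})
    return _response(400, {"error": f"unsupported request: {method}"})


def _response(status, body):
    return {
        "statusCode": status,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps(body, default=str),  # default=str handles DynamoDB Decimals
    }
```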
- Languages: Swift, Objective-C, Python, TypeScript, HTML, CSS
- Frameworks & libraries: Cocoa Touch, Core Data, MapBox, JSONCache,
Wikitude AR, Angular, AWS CLI
- Infrastructure: iOS, Android, REST, Amazon AWS (Lambda, DynamoDB, API Gateway, S3, CloudWatch)
- Database: AWS DynamoDB (NoSQL)
- Tools: Git/GitLab, Xcode, Atom, Android Studio, QuickDBD, AWS CLI, AWS Toolkit for Eclipse, emulambda, Swagger,
Apiary, iTunes Connect, TestFlight
- Standards & formats: HTTP, JSON, CORS
Origon ⋅ 2012-2016
In brief
- Original idea, all UX, design & development*
In detail
The unifying idea behind this app (iOS only) for shared contact lists (full feature set) is
per-list data replication: I maintain and give you access to my contact information, and you maintain and give me
access to yours. This way, our shared contact lists stay up to date.
To enable users to mirror each other’s contact information, the app and the backend together constitute a replication
framework that seamlessly persists changes from individual users and pushes those same changes back out to linked
users.
The app is implemented in Objective-C (code), while the backend, implemented in
Java (code), runs serverlessly on Google App Engine and utilises
RESTEasy for the replication API and Objectify for object graph persistence in Google Cloud Datastore.
- Languages: Objective-C, Java
- Frameworks & libraries: Cocoa Touch, Core Data, MapKit, Core Location, RESTEasy, JAX-RS, Jackson, Objectify
- Infrastructure: iOS, REST, Google App Engine
- Database: Google Cloud Datastore (NoSQL)
- Tools: Git, Xcode, Eclipse, Google Plugin for Eclipse, Maven, iTunes Connect, TestFlight
- Standards & formats: HTTP, JSON
* I no longer maintain Origon, but it’s still
available in the App Store, where it will
remain until it stops working.
Microsoft Development Center Norway ⋅ 2008-2012
FAST was acquired by Microsoft in 2008; after a transition period under the name "FAST, a Microsoft
Subsidiary", the company became Microsoft Development Center Norway, an integral part of the Microsoft Office organisation.
In brief
- Feature owner for topology management, scaling, backup & restore for Search in SharePoint 2013
- Development lead for the Enterprise Search core in SharePoint 2010
2010-2012: Senior Program Manager
In detail
For Search in SharePoint 2013, FAST’s next-generation search, which had been under development for a few years
prior to the acquisition, was ported from Java to .NET and C# and adapted as a SharePoint Service Application, to
replace the search options built for SharePoint 2010. I was feature owner for topology management and scaling
(provisioning, enablement, disablement and removal of search nodes in the distributed search architecture, and
distribution of search components among nodes) and backup & restore of distributed search indices across
topologies.
For this, I worked with the local development and test disciplines, and in addition managed a remote team of 10-15
developers and testers in Colombo, Sri Lanka.
2008-2010: Senior Development Lead
In detail
The Office 2010 wave was nearly halfway to completion when Microsoft acquired FAST, which effectively meant that we had
less than two years to integrate organisationally, identify and remediate any non-compliant use of open source
software, and develop and ship an integrated search solution as part of SharePoint 2010.
I led a team of highly skilled and fiercely independent developers who had been working on the search core since
the very early days of FAST. They were apprehensive about the acquisition and what it would mean for them and their
code, but they put their doubts aside and did their utmost to make the search core pluggable as a SharePoint
Service Application.
In the period leading up to code freeze ahead of release, I was part of the management group that conducted daily bug
triages for the overall FAST team. I also defended bugfixes in the Office-wide bug triage in Redmond, among
which was a group of fixes for a ship/no-ship performance issue that had taken us weeks of profiling and analysis
to narrow down.
- Languages: C#, C++, Java, Python
- Framework: .NET
- Infrastructure: Windows Server, SharePoint, .NET, WCF/SOAP, Linux, TCP/IP, IP sockets
- Database: SQL Server
- Standards & formats: HTTP, SOAP, WSDL, XML
- Tools: Git, Visual Studio, PowerShell, SharePoint, Product Studio (internal bug tracker), Remote Desktop
- Methods: Scrum, TDD
Fast Search & Transfer (FAST) ⋅ 2005-2008
I first returned to FAST in 2004 as a hired consultant from Accenture, working on customer projects with FAST’s Global
Services organisation. Then in 2005 I formally rejoined FAST.
In brief
- Created collaboration and deployment tools for efficient customer installations of ESP (Enterprise Search Platform)
- Established guidelines and best practices for streamlined and repeatable customer installations of ESP
- Built tailored search solutions for customers across Europe
2007-2008: Director, Solutions Architecture Center EMEA
In detail
In addition to following up on ongoing customer projects and driving the work to establish guidelines and best
practices for streamlined and repeatable customer installations, I created ESPedia, a Wikipedia-like
collection of technical documents ranging from configuration HOWTOs, via the aforementioned guidelines and best
practices, to the inner workings of ESP, written and maintained by myself and other FAST engineers, and with
read-only access for certified partners. I also developed an Apache Maven plugin to automate as much as
possible of the manual, error-prone and often repetitive work involved in a typical customer installation
of ESP.
2005-2007: Senior Solutions Architect, Global Services (GS)
In detail
ESP was a scalable, distributed search offering with many moving parts, running on anything from a single server to
clusters of ten or more. I worked with customers across Europe to build tailored search solutions,
including the main Yellow Pages sites in France, Austria, the Baltics and Norway; information & analytics companies
(Reed Elsevier, ProQuest); major newspapers (The Financial Times); major retailers (Carrefour); classifieds (Loot);
academic institutions (NTNU); and government and municipal organisations (UK Department for Work and Pensions,
Hereford County Council).
Together with a select group of experienced engineers, I was handpicked to join a newly formed Solutions
Architecture Center within Global Services, whose mandate was to establish best practices and develop
guidelines, support tools and reusable project templates in order to make projects more repeatable from customer to
customer.
- Product: FAST Enterprise Search Platform (ESP)
- Languages: C++, Python, Java
- Infrastructure: Linux, Solaris, AIX, Windows, TCP/IP, IP sockets, RAID, SAN, NAS
- Tools: CVS, Emacs, Vim, VNC, SSH tunneling, Confluence, JIRA, Excel
Accenture ⋅ 2001-2005
I’m working my way back in time; please come back later :)