Tech Coffee: Azure Digital Twins + HoloLens: Powering the Next Generation of IoT

Tech Coffee is a series that aims to introduce you to a new technology in less than 20 minutes. In this post, we will get an introduction to digital twins and some of the terms and technologies needed to create them. So grab a cup of coffee and let's get started!


Digital Twin?
There is a lot of discussion around digital twin solutions, where fleets of devices produce massive amounts of data that are communicated to the cloud for analysis and processing. The processed data is then consumed by interfaces such as digital twins, graph solutions and websites to present the real-time state of a device, or the result of a prediction made by a machine learning algorithm.
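As a rough, platform-agnostic sketch (not the Azure Digital Twins API; all names here are hypothetical), a digital twin is essentially a cloud-side model kept in sync with device telemetry, which downstream consumers query instead of the device itself:

```python
from dataclasses import dataclass, field

@dataclass
class DigitalTwin:
    """Cloud-side model mirroring the last-known state of one physical device."""
    device_id: str
    properties: dict = field(default_factory=dict)

    def apply_telemetry(self, message: dict) -> None:
        # Each telemetry message updates the twin's last-known state.
        self.properties.update(message)

    def needs_attention(self, key: str, threshold: float) -> bool:
        # A dashboard or HoloLens app queries the twin, not the device.
        return self.properties.get(key, 0.0) > threshold

twin = DigitalTwin("pump-42")
twin.apply_telemetry({"temperature": 81.5, "rpm": 1450})
print(twin.needs_attention("temperature", 75.0))  # True
```

Real platforms add device identity, message routing and history on top, but the core idea is this state-mirroring loop.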

Let’s first spend two minutes on a short introduction to what a digital twin really is:

These solutions consist of a large set of systems talking to each other, each a technology in its own right. To name a few, we have topics like IoT, IoT Edge, cloud, artificial intelligence, machine learning, Mixed Reality and Spatial Anchors, all fields deep enough to have their own specialists.

This learning curve can make it hard to get started and to understand how everything fits together. Going from a small IoT device, through the cloud, to a rendering on a mixed reality device such as HoloLens involves a lot of technological layers.

The following session from Microsoft Build 2019 goes through some of these concepts and explains how the world of IoT and the world of Mixed Reality can be brought together.

Azure Digital Twins + HoloLens: Powering the Next Generation of IoT


Why all the 3D?
3D is a powerful asset when it comes to visualizing data. In real life, we are used to seeing and working with items and objects from multiple angles, and getting a feel for them. When working with a machine, we know how it looks and how it feels.

Modern hardware such as mobile devices and PCs can easily render advanced 3D models and environments, and if designed the right way, these let us create brand new interfaces and ways of communicating with software. Suddenly we can bring the real asset into a digital world, connect sensors and data to it, and have it render and behave much like it does in the real world, creating a digital shadow of the real asset.

Now, if we add computation and processing through the cloud, plus machine learning algorithms that predict when things need attention or report a real-time production rate, we can both handle situations before they happen and simulate hypothetical situations that might occur in the real world.

Posted in Graphics, HoloLens, Mixed Reality

Apollo 11 HoloLens 2 Demo from Microsoft Build 2019

The Microsoft Build Conference just took place in Seattle, and I will be creating a couple of blog posts regarding content from Build 2019, but first, I’d like to share this little demo that was presented.
In this video, Andy and John share how technology and history can be combined to digitally recreate historic events. The cutting edge of technology in 1969 meets the cutting edge of technology in 2019.

In this demo, the Apollo Saturn V and the Lunar Module are beautifully rendered in amazing detail and interacted with through the HoloLens, digitally recreating the historic lunar landing.

Unreal Engine is about to get native support for HoloLens 2 development by the end of this month:
Creative communities across entertainment, visualization, design, manufacturing, and education eagerly anticipate Unreal Engine 4 native support for HoloLens 2, which Epic has confirmed will be released by the end of May. Originally intended as a stage demo for Microsoft Build, the Unreal Engine team unveiled a remarkable interactive visualization of the Apollo 11 lunar landing, which celebrates its 50th anniversary this year. This is a recording of a live rehearsal that took place on May 5, 2019.


Mixed Reality and Unreal
In the short session below, Ryan Vance from Epic Games and Jackson Fields from Microsoft Mixed Reality talk more about the Unreal Engine support for HoloLens 2 and Windows Mixed Reality headsets. The session also goes into more detail about the demo itself and how things were set up.


If you wish to dive deeper into the tech itself, and how to get started making Mixed Reality apps using UE, this session will walk you through everything you need.



Reentry – An Orbital Simulator
The demo presented at Microsoft Build 2019 shows the Apollo rocket from an external perspective and recreates some of its important stages. I have been working on another project where I use a game engine to create a simulator, much like Flight Simulator, in which you fly and operate these spacecraft using procedures and checklists similar to those used by the astronauts.
You can read more about this project at


Posted in Graphics, HoloLens, Mixed Reality

Reentry – An Orbital Simulator available on Steam


My space flight simulator Reentry – An Orbital Simulator is available as Early Access on Steam, and can be purchased at the following link:

The game is a realistic space flight simulator based on NASA’s space programs: from the first American human spaceflight in Project Mercury, through the rendezvous and EVAs of Project Gemini, to the Moon landing in Project Apollo.

NASA’s early spacecraft

The Apollo Command Module in Reentry, the spacecraft that flew to the Moon

The simulator puts you in control of NASA’s early spacecraft, where a rich, interactive and functional virtual cockpit, modeled and implemented after the official NASA manuals, lets you operate and pilot the spacecraft using procedures similar to what the real astronauts used.

The Gemini Virtual Cockpit panels – considered the bridge to the Moon

Each spacecraft has almost every switch implemented and connected to an underlying system that is used to operate the spacecraft itself. This includes fuses, computers, the electrical system, the environmental control systems, attitude control and so on.

The Mercury Virtual Cockpit panels – NASA’s first spaceship

Exceptional views of our oasis in the Universe
The Earth is rendered with very high-resolution textures, allowing you to see, explore, observe and enjoy the views of Earth from space. Launching into orbit will let you watch the colors change as you fly coast to coast over Africa, and see mountain ranges, lakes, cities and everything else visible from orbital altitudes.

A Study Level simulation

The game comes with a set of missions and mission editors designed to teach you how to operate these highly complex machines. The Academy will take you through the concepts, and each spacecraft comes with a lengthy flight manual so you can start studying the spacecraft.

Purchase and support the development of Reentry!

You can purchase and download the Early Access right now by following this link:


Thank you for your support! 🙂

Posted in Uncategorized

Project Apollo for the Reentry Space Simulator (UWP)


A quick update from the development of Project Apollo for Reentry. For those of you who are unfamiliar with my project, Reentry is a Windows 10 UWP app that lets you fly the Mercury, Gemini and soon the Apollo spacecraft from NASA’s early space programs. The purpose is to give you a realistic feeling of what it was like to be an astronaut in these machines, in full 3D. You can follow the real checklists the astronauts used, and study the spacecraft using the real manuals created by NASA.

In addition, the simulator comes with an in-game academy, as well as the game manuals found here:

In this post I wish to give you the state of the project, how things fit together and how it looks!


Before visiting Apollo, let me show you what you currently have access to.

The game is in Technical Preview II, and new content and modules are delivered as updates. Your installation will stay up to date as they roll out.

Technical Preview II gives you access to both the Mercury and the Gemini spacecraft. These are still works in progress, but in a state that allows you to perform complex maneuvers in space and follow real checklists. They also come with some missions you can fly, though the mission system is not final.

You can find some videos at

Mercury cockpit

Gemini cockpit

Mercury-Atlas launching


The Apollo module has been my main goal since I started working on this project, but I wanted to start with the basics, which is also where NASA started (given the complexity of the entire Apollo program). Mercury and Gemini have been two long projects, a total of three years. This has given me good insight into how the technology of the space program was developed, into astrophysics and orbital mechanics, and into how I can deal with some of the mistakes I have made during development.

This section will give you an overview of the current state of the project, from a development perspective.


Both the Mercury and the Gemini modules for Reentry were based on my physics engine, named GeoGravity, to enable orbital mechanics around Earth. The first major change for Apollo is that I’m now working with a PhD in astrophysics on combining my engine with his to solve a few things:
1) Stability
2) Going to the Moon
3) More accurate and realistic math
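To illustrate the stability point, here is a minimal two-body propagation sketch (not GeoGravity itself; constants and step size are illustrative). A semi-implicit (symplectic) Euler step keeps a circular orbit from spiraling outward the way plain explicit Euler does, which is exactly the kind of long-run stability an orbital simulator depends on:

```python
import math

MU = 3.986004418e14  # Earth's gravitational parameter, m^3/s^2

def step(pos, vel, dt):
    """One semi-implicit Euler step of two-body motion around Earth.
    Velocity is updated first, then position uses the NEW velocity --
    this ordering is what makes the scheme symplectic and stable."""
    r = math.hypot(pos[0], pos[1])
    ax = -MU * pos[0] / r**3
    ay = -MU * pos[1] / r**3
    vel = (vel[0] + ax * dt, vel[1] + ay * dt)
    pos = (pos[0] + vel[0] * dt, pos[1] + vel[1] * dt)
    return pos, vel

# Circular low-Earth orbit: v = sqrt(mu / r)
r0 = 6_771_000.0  # ~400 km altitude
pos, vel = (r0, 0.0), (0.0, math.sqrt(MU / r0))
for _ in range(5400):            # ~90 minutes at 1 s steps
    pos, vel = step(pos, vel, 1.0)
print(math.hypot(*pos) / r0)     # radius ratio stays close to 1.0
```

Going to the Moon adds a third body and much larger scales, which is where double precision and a more careful formulation become essential.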

I learned a lot from implementing the first two iterations of my physics engine, including how to handle scale and double precision in Unity, and with this new engine those lessons are all incorporated to give the sim more flexibility and better graphics.

In the screenshot above, the calculations required to fly to the Moon are being tested in an isolated environment, a completely separate project. Once they’re right, I will merge them into the Reentry project and start working on the TLI logic in Apollo.


Project Apollo is far from complete, but a version will soon be released into Technical Preview II, found in the Windows Store.

Most of the major components of the panel are complete, including both the model and its use as an interface to the mechanics under the hood.



In the screenshot above you can see that a lot of the switches are already in place. Each is connected logically to internal systems; the missing switches are not yet implemented. What you see is the commander’s view (left seat) and the controls for both the primary navigation and control systems and the backup Spacecraft Control System.


The above is the Lunar Module Pilot’s view, and contains the controls for the Electrical Power System, the Service Propulsion System and the Fuel Cells. It also contains the controls for communication.


The center seat is for the Command Module Pilot, and contains the controls for the computer, the Service Module thrusters, joysticks for both rotational and translational maneuvers, the Caution and Warning panel, the Mission Timer and the Environmental Control Systems. The hole in the middle is the entry to the Lunar Module itself, once docked with it.

Again, as you can see, most of the switches are not yet in place when it comes to the Environmental Control System and the Communication Systems.


With the exception of a stable guidance computer (LVDC), the launch sequence is working. The computer can run programs and works as the interface between the astronauts and the primary navigation and control system.

(Yes, the texture is wrong on keypad number 7)

The computer does not run the real emulator of the Apollo Command Module Computer (due to licensing); instead I’m working on an implementation that replicates much of its functionality and behavior. All the programs for prep and launch have been implemented.
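The real computer was operated through verb/noun pairs on its DSKY keypad ("display this", "monitor that"). As a toy sketch of that style of interface, here is a hypothetical dispatch table; the program entries and output formats below are illustrative only, not taken from the project or the real AGC software:

```python
# Hypothetical verb/noun dispatch in the style of the Apollo DSKY.
# The (verb, noun) pairs and display strings below are made up.
PROGRAMS = {
    (16, 36): lambda state: f"MET {state['met']:>6.0f} s",    # monitor elapsed time
    (6, 62):  lambda state: f"VEL {state['vel']:>6.1f} m/s",  # display velocity
}

def dsky(verb: int, noun: int, state: dict) -> str:
    handler = PROGRAMS.get((verb, noun))
    # An unknown verb/noun combination lights the operator error lamp.
    return handler(state) if handler else "OPR ERR"

state = {"met": 125.0, "vel": 2300.4}
print(dsky(16, 36, state))
print(dsky(99, 99, state))  # OPR ERR
```

The appeal of this design is that one small keypad can drive hundreds of functions, which is why it suits a switch-dense cockpit so well.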


The launch sequence kicks in once the countdown reaches zero and ignition is triggered. The rocket ascends into low-Earth parking orbit, where it will orbit Earth until the Trans-Lunar Injection burn.



Apollo comes with two independent systems for controlling the spacecraft. The first is the Primary Guidance, Navigation and Control System, and the backup is the Spacecraft Control System. Basically, the PGN&C system is driven automatically and/or through the Command Module Computer, while the SCS is controlled manually through switch configurations on the panels.

Most of these systems have been implemented, with the exception of the Flight Director Attitude Indicator (FDAI). The FDAI is functional, but I have not yet implemented the correct rotation it should drive to based on the attitude of the spacecraft relative to a stable platform.



Each of the thrusters/quads can be configured/enabled independently with both circuit breakers and switches in the cockpit.



Apollo contains a lot of electronics. From the Saturn V to the Service Module, to the Command Module. These are controlled through circuit breakers, switches and automatic systems. The spacecraft is only connected to external power while on the launch pad. Once the umbilical disconnects, it runs on internal power sources.

These power sources are batteries (backup) and Fuel Cells (primary). The Fuel Cells are located in the Service Module and are disconnected before reentry. Both power sources can deliver both DC and AC power; AC is produced by inverters. You have a lot of control over the electrical system, and it’s one of the most important systems to learn and pay attention to.


The Command Module comes with a lot of internal and external lights. You can configure which lights are powered, and their dim levels. The panels are illuminated, as is every digit you see.




As you can see from the screenshots above, the panel lighting can be configured independently. There are three light control panels, so you can create a dark atmosphere or a bright one, or, as in the last screenshot, configure one side to be bright and the other dark.


The SPS is the main engine, often referred to as The Engine. It is what alters your orbit (delta-v) after the Launch Vehicle is separated, and most importantly takes you home after Lunar Orbit. It’s basically the bell-shaped engine on the Service Module.
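How much delta-v a burn can deliver follows from the classic Tsiolkovsky rocket equation. A quick sketch with illustrative numbers (not the exact Apollo figures):

```python
import math

def delta_v(isp_s: float, m_wet: float, m_dry: float) -> float:
    """Tsiolkovsky rocket equation: dv = Isp * g0 * ln(m_wet / m_dry)."""
    g0 = 9.80665  # standard gravity, m/s^2
    return isp_s * g0 * math.log(m_wet / m_dry)

# Illustrative: a ~314 s Isp engine on a 28.8 t stack burning 18.4 t
# of propellant down to a 10.4 t dry mass.
dv = delta_v(314.0, 28_800.0, 10_400.0)
print(round(dv), "m/s")  # roughly 3100 m/s
```

This is why propellant management matters so much: the logarithm means the last tons of propellant buy disproportionately much delta-v.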


The engine needs to be gimbaled to keep the thrust aligned with the center of gravity, and the propellant feed needs to be kept balanced. Both can be controlled from the panel as well.

Using the control panel to the left, you can control the balance of the oxidizer and the fuel. On panel 1 there is also a gimbal panel that lets you gimbal the SPS on the pitch and yaw axes.

The SPS is also re-ignitable, meaning you can ignite it multiple times. Once the fuel and oxidizer levels drop into the mid-range, the engine needs a brief forward (ullage) thrust before ignition to settle the propellant in the correct spot inside the tanks. The translational thrusters are used for this.

The SPS is an important piece of equipment onboard, so it’s good to know how it works. Luckily, it’s mostly automatic.


From the cockpit of a virtual Apollo Command Module, I wish to thank you for your time!

If you want to follow the project and updates, feel free to join my Facebook page for the project:

This is the first time I have shared these details about the new module for Reentry. I hope you found it interesting, and feel free to reach out with any questions! Lastly, happy International Women’s Day to everyone out there!


Posted in Game programming

XNA Shader Programming source now on GitHub

As with the Commodore 64 programming tutorial series, I have now moved all the source from my XNA Shader Programming tutorial series to GitHub.

The XNA Shader Programming series goes through the theory and the HLSL implementation of various effects and concepts. Even though XNA is old, the shaders still look the same, and by following the guides, you should easily be able to implement these in Unity, DirectX, OpenGL and so on.
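A staple of such tutorial series is the grayscale post-process effect, and the per-pixel math is identical whatever the shading language. As a plain-Python sketch of what an HLSL grayscale pixel shader computes per fragment (the classic BT.601 luminance weights):

```python
def to_grayscale(pixel):
    """Luminance weighting, the same dot product a grayscale pixel
    shader performs per fragment:
        gray = dot(rgb, float3(0.299, 0.587, 0.114))"""
    r, g, b = pixel
    gray = 0.299 * r + 0.587 * g + 0.114 * b
    return (gray, gray, gray)

print(to_grayscale((1.0, 0.0, 0.0)))  # pure red -> (0.299, 0.299, 0.299)
```

Porting the tutorials to Unity, DirectX or OpenGL mostly means wrapping this same math in each platform's shader boilerplate.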

You can find the repo here, with links to the tutorial articles as well.


Note: If you are at GDC 17, let me know – I hope to meet some of my readers there!

Posted in Math, Shaders, XNA, XNA Shader Tutorial

Commodore 64 Assembly Programming on Windows


In 2011, I wrote a tutorial on how to program for the Commodore 64 on Windows. Today, I revisited the entire tutorial series, made changes and published the source on GitHub.

I know the old posts had a lot of dead links due to changes in how OneDrive was hosting the files, but this problem is now gone – you can find everything related to each tutorial hosted on GitHub. Also, due to some formatting issues that occurred when copying the code listings from the posts, I uploaded the individual code listings with the correct formatting, each with a compiled .prg file so you can easily run it and see how it should look if you get stuck.

Thanks for all the feedback on this tutorial; it’s one of my most-read series.

You can find the repo here with links to the individual posts:


Posted in Commodore 64

Project Gemini coming soon to my space simulator ReEntry!

I just uploaded the first gameplay video from my space simulator ReEntry! This is still an early preview, but it showcases a lot of functionality.

In the video, you will see the launch from the astronaut’s perspective. You need to flip switches to power various systems, configure the spacecraft for launch and monitor the instruments during ascent.

Once in orbit, I use the onboard computer to configure the orbital parameters and enter a near-circular orbit, before setting up for rendezvous and burning to reach the target satellite, named Agena. Once close to Agena, you can use the radar and the encoder to communicate with it: turn it on, configure its lights and so on before docking with it.

All switches are functional, but a lot of polish and tweaking is needed.

The goal of this simulator is to teach you how the Mercury and Gemini spacecraft worked, and the technology used to reach orbit and rendezvous. To fly, you can use the real manuals and checklists provided by NASA, or use the in-game academy.

Hope you will enjoy this little video!

Posted in Game programming, Unity, UWP