Tyler Gibson: Designing the human experience
https://www.thetylergibson.com/

XReal One Frames

Published 10 Feb 2025 - https://www.thetylergibson.com/xreal-one-reflection-covers/

Shipping note: USPS First-Class Mail does not offer tracking information. Typical transit times are 5-7 business days, but this is not guaranteed. Please select Ground or Priority Mail if you need tracking!

Features

  • Extremely lightweight
    • Reflectors (2.5g - only 1.1g more than stock frames)
    • Flats (0.9g - 0.5g less than stock frames)
  • Dual material (TPU for the reflectors and ASA for the frames)
  • High tolerance (over 70 prototypes to get the fit perfect)
  • Scientifically designed (no loss of peripheral vision)

Attaching the XReal One Covers

The XReal One product line has a new feature - replaceable front frames. XReal calls this "Kaleido", and these covers are one of the first products compatible with the Kaleido frame format!

Step 1: Unpacking


One big advantage of TPU is its flexibility - after a lot of testing, I found the reflectors could be rolled for shipping, which has let me reduce shipping costs by more than 50%.

When you unpack your lenses, unroll the plastic wrap. The TPU reflectors will be a bit squished. You can use a hair dryer on high to heat the TPU for 15-20 seconds - it will visibly relax back to the original shape. Or, fill a bowl with hot water (not boiling!) and drop the frames in for 30 seconds or so.

Step 2: Remove OEM frame fronts


Starting at the top corner, pull the frame away from the housing. This may take some effort - use a spudger or guitar pick if you don't have strong fingernails.

Do this from each top corner, until only the nose bridge area is still connected, then you can pull the nose area up and away pretty easily.

The stock frames are very tight and very firmly held on - it will take some force to remove them.

(Photo: original frames removed)

Step 3: Position Reflector Frames


Next take your new reflector frames and position them as shown. You must lift the frames up from the bottom to attach them to the XReal One.

Step 4: Snap Reflector Frames into place


Starting at the nose bridge, seat the new frames into the grooves and work outward to each temple. In the photo, the left temple has already been attached.

You will have to pull slightly to get the frames over the lip of the frame mount ring. Don't worry - these are designed to have some flex - your new frames won't tear or snap apart from this.

These reflector frames are designed to require substantially less force than the stock frames to attach and detach.

Enjoy!

Xreal Ultra Reflection Covers

Published 10 Feb 2025 - https://www.thetylergibson.com/xreal-ultra-reflection-covers/

Supermicro SC743 120mm fan wall adapter

Published 11 Dec 2024 - https://www.thetylergibson.com/supermicro-sc743-120mm-fan-wall-adapter/

This is a drop-in replacement for the stock 80mm fan wall. It uses the same mounting points and screws as the original. It also reuses the stock rubber gaskets and has the same footprint as the original.

Assembly Instructions

  • Original rubber gaskets can be reused in the bottom portion.
  • Install the base pieces into the case first, slotting the two pieces together.
    (Photo: base pieces slide-fit, overlapping)
  • Install fans into each section.
  • Press-fit the three main sections together.
    (Photo: support-free press fits)
  • Slide the original rubber gaskets into the bottom of the main assembly.
  • Slot the main assembly into the base.
    (Photo: slotting into the base)
  • Screw the main assembly into the chassis.
    (Photo: printed GF-ABS part in gray)

Self-print instructions

Print with a high temperature material. Do not use PLA.

Depending on the accuracy of your printer, you may need to ream or tap the case-side holes as appropriate. They are designed to be self-tapped by screwing directly into the holes. As such they are slightly undersized.

Do not overtighten these two screws, they will strip easily.

Print face down to avoid needing any supports.

XReal Air / Air 2 / Air 2 Pro Reflection Covers

Published 15 Oct 2024 - https://www.thetylergibson.com/xreal-reflection-covers/

Designing a weatherproof IoT device

Published 7 Oct 2024 - https://www.thetylergibson.com/designing-a-weatherproof-iot-device/

There were a lot of challenges in delivering the Purple Martins project, and one of the thorniest was the hardware. Unlike software, there's no continuous integration of new hardware into the wild.

Electronics

Due solely to the time constraints of the project (8 weeks) - we had to be done-done by Earth Day, April 22nd - I made the choice almost immediately to build on top of the Raspberry Pi platform.

I specifically chose to work with the Raspberry Pi 3B for several reasons:

  • The large ecosystem of supported components with wide availability
  • Stable platform with broad commercial adoption.
  • Microsoft Research had just recently published the Embedded Learning Library with targeted support for the RPi platform.

I briefly looked at other "bare metal" SoC boards from Espressif, Rockchip, OrangePi, Beaglebone, and others. But I knew we would end up spending more time writing firmware for the board than building solutions on top of it.

My hardware lead on the project, Andrew Reitano, set to work finding the right mix of off-the-shelf components that would meet our needs.

(Photo: the final hardware mix)

We settled on the following configuration (a quick sensor-read sketch follows the list):

  • BME280 temperature and humidity sensor
  • VL6180X time-of-flight sensor
  • Pi-EzConnect shield
  • Noctua NF-A4x10 fan
  • Raspberry Pi night vision camera (130° FOV, f/1.8, 3.6mm focal length)
  • 802.3af micro-USB active PoE splitter
  • Raspberry Pi 3B
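
As a rough illustration of the software side this mix enabled, here's a minimal Python sketch for reading the two I2C sensors. It's a sketch under assumptions: it uses the RPi.bme280 and Adafruit CircuitPython VL6180X libraries and the default I2C addresses, which aren't necessarily the exact stack we ran.

    # Minimal sensor-read sketch (assumes: pip install RPi.bme280 adafruit-circuitpython-vl6180x).
    # Libraries and addresses are illustrative, not necessarily the project's exact stack.
    import smbus2
    import bme280                  # RPi.bme280 package
    import board
    import busio
    import adafruit_vl6180x

    BME280_ADDR = 0x76             # some breakout boards use 0x77

    bus = smbus2.SMBus(1)          # I2C bus 1 on the Raspberry Pi
    calibration = bme280.load_calibration_params(bus, BME280_ADDR)

    i2c = busio.I2C(board.SCL, board.SDA)
    tof = adafruit_vl6180x.VL6180X(i2c)

    sample = bme280.sample(bus, BME280_ADDR, calibration)
    print(f"temp={sample.temperature:.1f}C humidity={sample.humidity:.1f}% range={tof.range}mm")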

Environmental protection

I learned pretty quickly that there were going to be extra challenges with running any computer outside in the summer in Florida. I knew we'd need to add active cooling and would need to find a balance between ventilation and environmental isolation - we couldn't conformal coat the electronics, so keeping air moving would be critical to preventing moisture from oxidizing the boards.

I also had to be really careful about thermal isolation - these nests get HOT in the summer, and I had to ensure that any heat generated by the electronics would get exhausted outside and not build up in the nest. We ran a number of tests to measure the thermal load of the compute as well as the thermal load of the IR emitters for night vision.

For the enclosures themselves, after 3D printing I put them through 5-6 coats of rubberized paint to ensure there could be no water penetration, and to resist abrasion from any wildlife that might take an interest.

(Photo: rubberizing the shells)

Somewhat surprisingly, the devices held up without conformal coating. I chalk a lot of that up to the Pi folks making hardware they know people are going to put into all kinds of weird environments. At the end of our 4 month initial collection period, we'd only had a single Pi that needed to be replaced - and it appeared to be a failure caused by a faulty IR emitter, not weather related at all.

Enclosures

I had to design two enclosures. One for the nest entrance, and one for the interior. Jason provided me a gourd to take home, but I had to reverse engineer a CAD model for it.


The internal facing unit was relatively easy - match the screw cap, and place components accordingly. The front mount unit was a bit more difficult. We couldn't put any holes in the nests, everything had to be fully removable and serviceable. Mounting fixed objects to cylindrical surfaces is always a bit of a challenge.

Unlike a home project or PoC, I had to work on the assumption that these units could be in the field for years. As such I had to think about assembly, maintenance, repair, and replacement. The pieces were all designed specifically for 3D printing without any need for support or any advanced materials - so the Disney staff could replicate and replace units at any time.

I also had to create assembly instructions that could be used by my team as well as any future employees.

Gourd enclosure


Perch enclosure


Assembly


Building something once is easy. Building it multiple times gets really hard.

We only had to build 8 units - 4 nests with two compute units per nest. But it took the team a solid two days to get everything up and running the first time around.

There were a lot of moving pieces - on the hardware side, we didn't have test harnesses for each component (nor the time to make them). So we had to take a test-bench style approach - build the electronics and test the whole system.

Fortunately I took the advice of my colleagues Andrew and Bret Stateham, and we over-ordered every part by 50%. And sure enough there were bad modules and bad connectors that had to be swapped out. By the afternoon of assembly day 1, everything was mechanically ready.

Firmware and software lifecycle

I knew early on that we'd have to make sure these devices could be fully remotely managed and maintained.

Our initial plan was to use Microsoft IoT Edge - essentially running a container engine on the Raspberry Pi and packaging everything into a single deployable Docker image. Microsoft had recently published some guidance for doing this with the Pi 3B, so we went for it. However, the team pretty quickly discovered it wasn't going to work for this project.

IoT Edge worked great for a lot of simple use cases on the Pi, but the management and runtime layer incurs some memory and CPU overhead. In our case, that memory overhead was just a bit too much for us to fit our ELL model on the device. With the Pi 4 and beyond offering multiple memory configurations, this is no longer an issue.

So instead we opted for a more traditional, but less robust, path: pushing packages to the device through a digital twin configuration in Azure IoT Hub. For pure software updates this worked nearly perfectly from the start: stage the code, build, test against the simulated digital twin, deploy to a lab device and test on-device, then push to the wild if everything passes.
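
A sketch of that pattern with the azure-iot-device Python SDK - the twin property names here (packageVersion, packageUrl) are hypothetical stand-ins, not the project's actual schema:

    from azure.iot.device import IoTHubDeviceClient

    CONN_STR = "HostName=...;DeviceId=...;SharedAccessKey=..."  # per-device, issued by IoT Hub

    client = IoTHubDeviceClient.create_from_connection_string(CONN_STR)
    client.connect()

    def on_desired_patch(patch):
        # The service pushes a desired-properties patch, e.g.
        # {"packageVersion": "1.4.2", "packageUrl": "https://..."} (hypothetical schema).
        version = patch.get("packageVersion")
        if version:
            # Download, verify, and install the staged package here (elided), then
            # acknowledge by reporting the applied version back to the twin.
            client.patch_twin_reported_properties({"packageVersion": version, "status": "applied"})

    client.on_twin_desired_properties_patch_received = on_desired_patch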

But we had a bunch of hardware, hardware that had multiple configurations across deployed devices. Hardware / software integration testing sucks. So we did the best we could for the time and resources available.

(Photo: the team hammering on integration testing and verification)

We got everyone in a room together and we stood up the entire network of devices end to end. It was painful, it took a couple of days, and the team fixed a TON of issues in a very short amount of time.

This could have easily taken weeks or months of ad-hoc bug filing, testing, and sprint cadence work. Sometimes a good ol' bug bash and hardware bring-up is the right way to go.

Deployment

For our first field test, we were able to run a single unit back to a very "hope facilities doesn't see this before we finish the test" PoE switch and Wi-Fi repeater.


But for the deployment, we had to have proper power runs. In a cruel twist of fate, one of the few Purple Martin nest locations on all of Disney's properties without nearby power happened to be right where we needed to install.

Thankfully, Disney is awesome, and the Parks organization trenched a power run right away. This ended up being the largest capital cost of the entire project.


Outcomes

Everything worked.

(Photo: instrumented nests in the wild)

We had 4 nests with 8 compute units running for months in the wild, delivering back hundreds of hours of behavioral video, and thousands of data points capturing the exact timeline of the migration, nesting, mating, and fledging lifecycles for the Purple Martins.

Read more about the project here

Baby bottle carriers

Published 5 Oct 2024 - https://www.thetylergibson.com/baby-bottle-carriers/

Am I the only new parent who has ended up with more than a dozen baby bottles to manage? I got tired of bottles falling all over the place, so I designed these carriers. Friends have found them useful as well, so I've added options for different brands as they come along.


The handle is removable, so the caddy can be used with or without it. To attach the handle, insert it into the top of the caddy, press it all the way out the bottom, rotate it 90 degrees, and pull it tight.

These work well for storage and transport. They will fit in most common baby bottle cooler bags.

Ikea Symfonisk Gen2 Custom Amp Enclosure

Published 5 Oct 2024 - https://www.thetylergibson.com/ikea-symfonisk-gen2-custom-amp-enclosure/

A custom enclosure for modified Ikea Symfonisk Gen 2 speakers being used as amplifiers.

This is printed in glass-fiber-reinforced ABS and includes two speaker binding posts along with the two-part 3D-printed enclosure. It does not include the Symfonisk PCB or any additional hardware.


This will not work for Gen 1 Symfonisk speakers - find the article here, and the cults3d download link here.

Assembly instructions

All screws need only be tightened until they are snug. Overtightening will strip the screw posts.

Case bottom

  1. Install the banana plugs for the audio output (do not connect them yet).
  2. Install the Ethernet port and fasten it into place.
  3. Install the power port and fasten into place.
  4. Install the power box, routing the cable on the power port side of the case as shown.
  5. Connect the speaker wires to the banana plug posts.

Case top

  1. Insert the two Wi-Fi antennas in the top and side of the case - they should fit snugly in place.
  2. Install the rubber button pad into the corresponding holes at the bottom of the case.
  3. Install the button PCB over the pads - take care to match the orientation correctly.
  4. Install the button PCB retaining mount and fasten to case. Take extreme care not to overtighten the screws. They only need a snug fit.
  5. Install the main PCB into the case - top first (farthest from the button PCB).
    Angle the PCB down into the case and push it all the way to the top, then lower the other end into place.
    Take care to route the Wi-Fi antennas so they aren't pinched during installation of the main PCB.
  6. Fasten the main PCB in place.
  7. Plug in the Wi-Fi antennas, speaker cable, button PCB cable, and power cable.
  8. Close the top clamshell against the bottom and fasten them together using the original exterior case screws.

Self-print instructions

There are two options to self-print.

  • Front/Back STLs - monolithic parts that will need to be printed with supports on FDM printers.
  • NoSupports_SplitBack STLs - the bottom section is split into two pieces that snap-fit together so it can print without supports. If you print this version, I recommend supergluing the parts together to ensure they are permanently bonded.

Tested at 0.2mm and 0.3mm layer heights with 3 perimeters, 1.2mm of solid top and bottom layers, and 15% infill.

Details

Completely redesigned enclosure for the Symfonisk Gen2. It is more compact than Gen1. All cables exit in the same downward facing direction and within the volume of the body, making cable routing and management in attics and ceilings much more manageable.

The story behind the 'Magic of Nature'

Published 25 Sep 2024 - https://www.thetylergibson.com/the-process-behind-magic-of-nature/

Related coverage: ‘Magic of nature’: Disney’s smart birdhouses reveal the secret lives of purple martins | Microsoft Transform

Background - from maintenance to martins

I didn't start with a project pitch to understand a bird species. My incredible colleague Molly McCarthy and the broader ISE team had been hard at work rebuilding a trusted relationship with Disney. After some awesome code-with engagement successes, I was asked to fly down to Orlando to discuss interest in using HoloLens in their parks for operations, training, and maintenance.

I spent the afternoon with the Disney executive and learned through our conversation that they were really seeking a way for our team to work with their Emerging Technologies group in DPEP (Disney Parks, Experiences and Products) - and thought AR and HoloLens would be a cool next-gen technology to work together on.

I'd already spent several years knee deep in AR with the 3D Streaming Toolkit, so I was the "HoloLens guy". We walked through Disney's Animal Kingdom to spark our brains and meet employees in-situ, and by the end of the walk I think we had a dozen ideas for incredible projects.

The one we thought was going to be "the" project was an AR experience in the meerkat habitat. The meerkats are a beloved exhibit, but they spend a lot of time underground and out of sight. Using ground sonar and other passive sensors, we imagined an x-ray style view of the meerkats in their tunnels that would make the guest experience always interactive and engaging.

A return trip to the park unfortunately dropped a bucket of cold water on this plan. The conservationists told us that a lot of the behavior that happens underground is very R-rated, as meerkats can be incredibly violent at times. And their tunnels constantly shift, collapse, and get rerouted, making it almost impossible to instrument sensors with a long useful lifespan.

But this was the spark we needed - the folks at Disney had an immediate thought - what about the Purple Martins? They are a species with very active conservation efforts, attuned to humans, and while they had nests all over Disney's parks in Orlando, there'd never been much of a guest-facing experience with them.

The project had its first seed planted.

Opportunity - unmet needs

We met Jason Fischer and Dave MacLean on our first day on the ground to sort out how Microsoft could help. Jason is the purple martin man at Disney.

Jason schooled us on the real process of animal conservation and research with the Purple Martins, and by the end of the first day we had a long list of great ideas to improve the lives of the employees, raise the bar on the understanding of the birds' behavior, and bring all of this incredible science to park guests in some unique ways.

(Photo: Jason Fischer showing us the real daily operations of Purple Martin conservation)

Conservationists spend an incredible amount of time lowering the nests to the ground, opening each one, and manually marking what they see inside.

  • The precision of the observational data is low, and they are limited in how often they can check the nests and only during the workday.
  • A large number of the distinct phases of mating, hatching, growth, and fledging have to be inferred in between observations.
  • There are many known threats to the purple martins - hawks, owls, starlings, sparrows, and mites, to name a few. But there is little direct evidence of how these threats interact with the nest site, and little opportunity for intervention before disaster strikes.

Goals and constraints

I defined four goals for the project after our ideation and immersion time with Disney.

Goals

  1. Invert the observation operational model for conservationists
    Enable them to spend the bulk of their time analyzing behavior and patterns, rather than raising and lowering nests.
  2. Help the Emerging Technologies team deliver a guest facing experience
    Almost everything guest-facing comes through Imagineering at Disney. We wanted to demonstrate that other teams could deliver high-quality experiences to guests.
  3. Strengthen the relationship between Microsoft and Disney by delivering an innovative solution together
    Continuing the great work of my colleagues in building trust and showing through action that Microsoft is a great partner and customer enabler.
  4. Demonstrate a scalable long-term solution
    We had to prove to Disney that this could be more than a one-off, limited time project. That we were building a solution that could scale for their business, for guest experiences, and for animal conservation efforts.

Constraints

  • Time
    We began the project in earnest in January. We had to deliver by Earth Day, April 22nd, 2018. Any delay would mean the whole project would be cancelled and never see the light of day.
  • Resources
    My team had 7 full time resources from Microsoft and 4 from Disney. We needed to stay within the normal operating budget of about $50k spend. I was extremely fortunate that the team members from both companies were assembled based on the skills we needed to succeed - so we had the right experts on the team from the beginning.
(Photo: the Purple Martins project team)
  • Location
    Most of the Microsoft team was in Redmond, while all of the Disney team was in Orlando. We could travel as needed, but had to be very deliberate about making the most of physical time together as a team.
  • Quality
    Disney sets a very high bar for quality - both behind the curtains and in front of guests. This project had to have all of the testing, process, and solution rigor of any commercial offering from either company.
  • Scale
    The project had to be scalable from the start. We all started with the assumption that this would become a service to empower the larger public Purple Martin conservation community.

Architecture - science at the nest

(Diagram: system architecture for the Purple Martins project)

Given the goals and constraints, a number of the architectural decisions fell into place very quickly and early.

The basic idea was to put two small compute modules on each nest, one at the entrance and one peering into the central interior, connect these units via PoE to an outdoor switch, and push all observational data to a cloud data store. By training and deploying machine vision models to these compute modules, we could record when critical events happen and alert the conservationists when necessary.

The nests are under continuous monitoring; in the near term, the conservationists continue their operational process in a similar manner, just skipping the step of physically opening each nest.
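
A minimal sketch of that edge loop - grab a frame, run a local classifier, and only emit an event when confidence crosses a threshold. The classify() body is a placeholder; the real project ran a compiled ELL model in that slot.

    import time
    import cv2  # OpenCV, used here for camera capture

    CONFIDENCE_THRESHOLD = 0.8

    def classify(frame):
        # Placeholder for the on-device model (the project used ELL).
        # Return (label, confidence) from whatever classifier is deployed.
        return "bird_at_entrance", 0.0

    def send_event(event):
        # Stand-in for the IoT Hub telemetry call (see the pipeline sketch below).
        print("event:", event)

    cap = cv2.VideoCapture(0)
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                time.sleep(0.5)
                continue
            label, confidence = classify(frame)
            if confidence >= CONFIDENCE_THRESHOLD:
                send_event({"label": label, "confidence": confidence, "ts": time.time()})
    finally:
        cap.release()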


Hardware

Raspberry Pi as the IoT platform at the nest is well covered in my article on building a weather resistant IoT device. In short - we wanted this to scale for a large community. To do so we needed to offload as much compute as possible to the edge clients. Lightweight services are cheap and scalable services.

The Jetson board was an easy choice, as it had great IoT edge support, massive compute for the real-time video analysis needed, and was already industrialized for an outdoor deployment.

Data scaling

Keeping compute local was a challenge solved by the then newly released Embedded Learning Library (ELL) from Microsoft Research. It allowed us to run all the inferencing locally, offloading almost all compute from cloud services. ELL needed every ounce of system memory and CPU/GPU time we could squeeze out of the Pi 3B.

This necessitated a deviation from the "template" architecture of IoT Edge, and containers through ACR. I made the choice to keep the Raspberry Pi 3B device management at the bare metal, rather than running through IoT Edge and containers. With the Pi4 and beyond, containers are the way to go.

We adopted the pipeline pattern of IoT hub to Event hub and into CosmosDB from a long list of other successful commercial customer solutions. It was well suited to this project - IoT devices sending a mix of event triggered and continuous aggregated telemetry to be used as a dataset for future analysis.
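
On the device side, that pipeline starts with a plain device-to-cloud message. A hedged sketch with the azure-iot-device SDK - the payload fields and device naming are illustrative:

    import json
    from azure.iot.device import IoTHubDeviceClient, Message

    CONN_STR = "HostName=...;DeviceId=...;SharedAccessKey=..."  # placeholder

    client = IoTHubDeviceClient.create_from_connection_string(CONN_STR)
    client.connect()

    event = {
        "deviceId": "nest-01-entrance",   # illustrative naming scheme
        "eventType": "entry_detected",
        "confidence": 0.93,
        "temperature": 31.2,
    }
    msg = Message(json.dumps(event))
    msg.content_type = "application/json"
    msg.content_encoding = "utf-8"
    client.send_message(msg)  # IoT Hub routes this on to Event Hubs and into Cosmos DB
    client.shutdown()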

Quick back-of-the-napkin math put our CoGS (cost of goods and services) at about $500/yr ($5 per device per month) for the 4 nests in the project, and at about $7,500/yr to observe every nest on Disney property (~$3.50 per device per month).

This would be okay for Disney, as it would provide rich data - video and images synced to environmental telemetry - but we immediately knew it was unlikely to scale for a large public community for two reasons:

1. The ingest, storage, and processing for the video content would be massive.

The primary purpose for retaining video and image data is to improve model accuracy through continual training. Even during the project I witnessed the "YouTube phenomenon" begin to happen - everyone watched the early recordings, but the more that came in, the less we watched.

(Reference: YouTube channels, uploads and views: A statistical analysis of the past 10 years)

Knowing that keeping all the data wouldn't be useful or desirable, I made media submission optional on both the client side and the service side - clients can disable media submission, and each registered device has to be granted media submission rights on the service end.

Without sending video the scale math made a lot more sense - coming in at about $12,000/yr to enable 250,000 nests (500,000 Pis) - just under $0.05 per device per month - including buffer to send ~0.5% of events with full video / audio / image data.

2. Community data science - data science gets expensive with large communities

I also knew that this dataset would be a goldmine for citizen science. Even with the small sample size of our 4-nest observation, we found a treasure trove of insights. Opening PowerBI and Cosmos to the conservationists at Disney was no problem, but enabling this for a massive volunteer community would quickly skyrocket the costs for a centralized approach to analysis.

So we made room for a community API service very early on. A fully public set of REST endpoints that were highly optimized to minimize retrieval costs. Users can't perform complex queries, but they can quickly get meaningful slices of the full dataset, as well as being able to just pull archives of the entire telemetry collection. This was also baked into the cost estimation above.
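
The endpoint shapes below are hypothetical - the real paths and parameters weren't published - but they sketch the intent: cheap, pre-shaped slices rather than arbitrary queries.

    import requests

    BASE = "https://api.example.org/v1"  # hypothetical host and paths

    # Pull a pre-shaped slice of telemetry for one nest over a date range.
    resp = requests.get(
        f"{BASE}/nests/nest-01/telemetry",
        params={"from": "2018-04-01", "to": "2018-04-22", "fields": "temperature,humidity"},
        timeout=30,
    )
    resp.raise_for_status()
    for row in resp.json():
        print(row)

    # Or just pull an archive of the entire telemetry collection for offline analysis.
    archive = requests.get(f"{BASE}/archives/telemetry/latest", timeout=300)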

User experience

The user experience for this project was in many ways the easy part. Once we had good hardware, good pipelines, and good storage working, we had a ton of incredibly heartwarming, awe-inspiring, and insightful content at our fingertips.

For the conservation team and business stakeholders, I built out a set of PowerBI reports and dashboards enabling them to dig into the lifecycle of these birds with an astonishing level of accuracy, detail, and empathy.

(Video: PowerBI dashboard showing labeled machine vision events from real nests)

The ML engineer had direct pipelines for supervised learning from the verification and labeling of incoming video streams.

For park guests, we started work on ARKit-based iPad applications to tell compelling stories with the mountain of telemetry data that was collected.

'Taking flight' with park guests

The other half of the project was taking this data and putting it together into an educational entertainment experience for park guests on Earth Day 2018. Our colleagues wanted to ensure the work we had done together could be appreciated and enjoyed by park guests.

It had been extremely rare for a non-Imagineering group at Disney to put guest-facing experiences into the park. A third party had never done so.

We designed and built an ARKit-based iPad application we called Taking Flight, which brought together the learnings from our IoT data in the nests into an interactive, augmented reality experience.

(Video: a demo of the flight tracking game portion of "Taking Flight")

Outcomes

We learned a ton as a cross-company team with this project.

It was a huge success for the business relationship, helping to continue paving the way for multiple deals over the next several years.

Related: Microsoft and The Walt Disney Studios to develop ‘scene-to-screen’ content workflows - the companies announced a five-year innovation partnership to pilot new ways to create, produce and distribute content in the Microsoft Azure cloud, with Microsoft becoming a Disney Studios StudioLAB innovation partner (Sept. 13, 2019).

This was one of the first commercial deployments of the Embedded Learning Library and we proved the viability of running real-time inferencing on extremely low power IoT devices.

I learned a ton about the realities of managing IoT devices in the wild and the critical importance of 100% reliable OTA updating of both system and solution functionality.

I also learned a ton about real world MLOps - wrangling data for training, labeling, and the many ways to continuously improve model accuracy.

We collected several months of continuous behavioral and environmental data on the instrumented nests, producing the largest unified behavioral dataset collected for Purple Martins.

Even with a lot of AR experience in HoloLens, architecting and building an ARKit application for the iPad was another great learning experience - data formatting, micro to macro alignment for GPS data, mapping, and calibration to name a few things.

In the end, I led a small and mighty cross-company team to launch a guest-facing experience in Disney's Animal Kingdom on Earth Day 2018.

A rare event for the Emerging Technology team, and even more rare for a non-sponsored vendor to contribute.

Beginnings of a platform - 3D Streaming Toolkit

Published 25 Sep 2024 - https://www.thetylergibson.com/3d-streaming-toolkit-2-2/

Opportunity - unmet needs

In February 2016, we got in contact with AVEVA. They were an early HoloLens solution developer, and they'd hit a wall. In their Everything3D product, their customers develop extremely large, detailed BIM plans for power plants and large marine vessels. The fidelity of these models is critical to their applications.

So they posed the question - can we render this remotely from a datacenter to the device?

AVEVA specifically wanted an integrated product experience, not just remote-desktop or remote-app projection in AR.

Challenges

  • Why hasn't anyone else done this yet?
  • Will this make people sick?
  • Do solutions work well on form factors beyond HoloLens?
  • Can this approach scale efficiently?

Before we started, I built an iterative roadmap alongside Chase Laurendine from AVEVA.

The plan

  1. Deep dive with AVEVA to understand their product, technologies, and customer needs.
  2. Come together for a hackfest to attempt to build a functional proof of concept.
  3. If we succeed in a proof of concept, then determine requirements for a commercially viable solution.
  4. Develop an MVP and then test and validate viability with AVEVA.
  5. If successful, publish the toolkit as an open-source project and begin onboarding more customers.

Preparing for the hackfest

We had one week in a room to build a functional prototype. This was one of the bedrock tenets of the group I worked with - solving hard problems completely takes a long time, but if you can't show a glimmer of hope in a week or two, there are probably some much larger roadblocks (technological, organizational, regulatory, cultural) that need to be overcome first.

Due to the limited window of opportunity, I started researching. I knew we wouldn't have the time or bandwidth to build an end-to-end prototype from scratch.

I knew we would need three critical components to prove this could work:

  1. A low latency method to capture a frame, screen, or buffer and rehydrate it on the HoloLens
  2. A transport protocol and architecture capable of working across local and wide area networks that can carry multiple data streams (buffers, inputs, and metadata) with good fault tolerance and congestion fallback.
  3. A method to display the content on HoloLens with pose prediction and image stabilization.

After speaking briefly with some colleagues on the HoloLens side of Microsoft, I knew that #3 was possible, but would require a significant amount of organizational effort, so we kicked that can down the road.

Low-latency transcoding

This is not a new problem, and it wasn't in 2016 either. Fortunately, Microsoft had an amazing solution - Remote desktop. I reached out to colleagues who had worked on RDP and learned that it had just received a massive update enabling much higher resolutions, faster framerates, and better image quality.

Unfortunately, the code wasn't accessible - internal only, no published APIs, no integrations, and definitely not open-sourceable.

There were a number of hardware encoding devices on the market that could do "real-time" processing, but none of them were available on any large cloud service provider, and all would take weeks to get on site for even a test.

Next I looked at market solutions - specifically game streaming. Nvidia had recently launched GeForce Now into public beta on their Shield device, and v5.0 of the NVENC (Nvidia video encoder) API had added a low-latency encoding option in late 2014, making it possible for anyone to use their hardware for "real-time" video encoding.

We all had Nvidia GPUs available with NVENC cores, and most importantly, both Azure (N-series) and AWS (EC2 GPU) had large deployments of GPU instances available.

ℹ️
I also looked at options for just pushing a raw screen buffer over the network. For HoloLens 1, this worked out to 2560×720 @ 60fps - even at 8-bit 4:2:0, about 1.4 Gbps - far too much data to scale. And general compression algorithms either didn't have hardware acceleration or were less efficient than just using h264/HEVC.
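
For the curious, the arithmetic behind that number (8-bit 4:2:0 averages 12 bits per pixel; packet and container overhead push the raw figure toward 1.4 Gbps on the wire):

    width, height, fps = 2560, 720, 60
    bits_per_pixel = 12      # 8-bit 4:2:0: 8 bits luma + 4 bits chroma per pixel

    raw_bps = width * height * fps * bits_per_pixel
    print(f"{raw_bps / 1e9:.2f} Gbps")  # ~1.33 Gbps before any transport overhead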

So I went with NVENC - highest likelihood to find success quickly.
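
For a feel of what low-latency NVENC encoding looks like, here's a sketch driving ffmpeg's h264_nvenc encoder from Python. The flags assume a reasonably recent ffmpeg build with NVENC support (the -tune ull option arrived around ffmpeg 4.3); the toolkit itself called the NVENC API directly rather than shelling out to ffmpeg.

    import subprocess

    # Encode a raw frame stream with NVENC, tuned for latency over quality.
    cmd = [
        "ffmpeg",
        "-f", "rawvideo", "-pix_fmt", "yuv420p",
        "-s", "2560x720", "-r", "60",
        "-i", "frames.yuv",          # placeholder raw-frame source
        "-c:v", "h264_nvenc",        # NVIDIA hardware encoder
        "-preset", "p1",             # fastest preset
        "-tune", "ull",              # ultra-low-latency tuning
        "-zerolatency", "1",         # no frame-reordering delay
        "-b:v", "20M",
        "-f", "mpegts", "udp://127.0.0.1:5000",
    ]
    subprocess.run(cmd, check=True)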

Transport

For the transport protocol, I looked again at several candidates as this is another technology area that was mature at the time.

The first candidate was using existing Microsoft technologies - especially Skype. I had several teammates who had come over from the Skype org, so it was an easy conversation to have. They all pointed me away from this direction pretty quickly. Much like RDP - Skype didn't have a great developer story, and the codebase (especially the infrastructure side) was in a lot of flux as they were consolidating and modernizing with the Teams platform.

Fortunately, multiple colleagues pointed me to the same alternative - WebRTC.

I also looked at WebSockets and SignalR (a layer on top of WebSockets), but neither had any affordances for video streaming - we'd have to build that from the ground up.

The biggest drawback at the time for WebRTC was the logistics of building it for WinRT - a requirement for HoloLens client apps. Again, fortunately for me (this will become a trend), I was in the right place at the right time. A small team elsewhere in Microsoft had already been tackling this problem for a year - and had published WebRTC-UWP, a fork of WebRTC with the intent to merge everything back into Google's codebase when stable.

Thanks to those guys (big shout out to James Cadd, Bernard Aboba, and Robin Raymond) - I was able to get a running demo in a day or two.

So we had a transport protocol and framework in WebRTC.

Stable holograms

The last piece of the puzzle was frankly the easiest for the hackfest, while being the most difficult long-term problem on the path to a usable solution for customers.

I was able to speak with the HoloLens team who built the Holographic Remoting Player, to seek their help and advice on the larger project of remote rendering we were embarking on.

They gave us two pieces of advice:

  1. Don't bother, it won't work.
  2. If it does work, it will make people sick.

Not the most cheerful advice, but this came from a group who had worked hard on this problem for some time and really knew both the HoloLens and AR technology as a whole much better than any of us. They confirmed that yes, there were APIs for camera pose prediction, and that it would be technically possible to leverage those APIs to provide stable remotely rendered video as a binocular image.

The challenge is that all of these APIs were private and closely guarded IP for Microsoft. Zero chance this was going to be allowed into an open-source project.

With that knowledge in tow, I made the call to punt this problem down the road. If we couldn't succeed with a PoC, the stability wouldn't matter. And if we did succeed, real customers with real money are a great way to put pressure on organizational obstacles.

Proof of Concept

In short, it worked. In 5 days in a room, alongside a couple of engineers from AVEVA, and with the genius help of Jason Fox, we got a demo running from server to headset.


This was very, very basic - but we proved that a solution was at least technically possible. It took a little time to put together the proposal, but within a month, I got the green light to go for it.

Building an open-source product

I put together a small and mighty team to tackle this project. We only succeeded due to the unique combination of experience, skills, and perspective of each team member.

(Photo: the core 3D Streaming Toolkit team)

Andrei Ermilov, Chase Laurendine, Phong Cao, Anastasia Linzbach, Ben Greenier.

Andrei published an amazing retrospective on the full architecture of the V2 toolkit here on the ISE Developer Blog - Real-time streaming of 3D enterprise applications from the cloud to low-powered devices.

The article details all of the major contributions we made back to WebRTC to enable it as a robust transport for real-time interactive experiences, as well as the challenges of getting NVEnc working in a scaled production architecture.

What I will cover more in depth are some aspects of the project that still make it uniquely differentiated from other commercial remote rendering solutions that were extremely important to customer adoption.

Outcomes

Many commercial customers adopted the toolkit to build out their own platforms, including AVEVA, Intel, BMW, Medivis, and Kognitiv Spark.

What was most surprising is how the core technology of this project found legs far beyond the world of HoloLens and spatial compute. There turned out to be many customers who needed ways to take a remote compute resource and project a running application or environment into a highly integrated client experience.

The success of the toolkit was truly realized when the same HoloLens team that sometimes begrudgingly helped us along the way, came back asking about our growth and success. Kudos to them - they always helped us and our customers, even when we disagreed on approach or outcome - one of the things I cherished about working at Microsoft.

When we found success, the door opened. I wrote a business opportunity brief for the organization, and a year or two later, they officially launched Azure Remote Rendering.

Similarly, because we had customers using the toolkit commercially, they all came back to us with the same question - does Microsoft offer WebRTC infrastructure as a service? Initially we pointed them to some great partners like SignalWire. However, we had built up a great relationship with folks in Microsoft Teams and Skype along the way, so I thought it was worth a conversation.

It turned out the timing was right once again. There was willingness to explore opening Skype's infrastructure into an Azure service, and it wouldn't be a huge lift to put a WebRTC capable gateway like Janus in front to make it all work.
Around the same time that ARR launched, so did Azure Communication Services.

Small projects

Published 30 Aug 2024 - https://www.thetylergibson.com/small-projects/

Monoprice Select Mini Guides

Published 30 Aug 2024 - https://www.thetylergibson.com/monoprice-select-mini-guides/

Desktop PC Experiments

Published 30 Aug 2024 - https://www.thetylergibson.com/desktop-pc-experiments/

Pac-man Arcade Restoration

Published 30 Aug 2024 - https://www.thetylergibson.com/pac-man-arcade-restoration/

Affordable Laser-cut Teleprompter

Published 30 Aug 2024 - https://www.thetylergibson.com/affordable-laser-cut-teleprompter/

Ikea Symfonisk Mod Project

Published 28 Aug 2024 - https://www.thetylergibson.com/ikea-symfonisk-mod-project/

Perhaps you've heard of Sonos. I have been a fan for a long time, but the prices for their products have always been far out of my financial reach.

Then Ikea started selling the Symfonisk Bookshelf Speaker for $99.00. I bought three. I love them. Brilliant sound, beautiful design, simple interface. Ikea even sells wall mount hardware for them. After living with them for a few months, I realized that in my bedroom and bathroom I wanted Sonos, but I didn't want a bookshelf speaker taking up the space. Like any other nerdy homeowner, I wanted some cool in-ceiling speakers instead.

And so it started: modifying the Symfonisk to work as a normal amplifier. This is the series of articles for modifying your own Symfonisk speakers.
