Loan Data - VR

IBM Use Case: AI-Augmented Models in Virtual Reality

Visualize Loan Data Prediction Models with AI-Generated Insights

VRhomepage.png

Problem

Users struggle to understand and explain how their machine learning models make predictions.

Our solution

Combine Immersive Data and virtual reality to increase explainability. In this case, we want to predict whether a bank customer will default on their loan. By visualizing the data in 3D, users can better understand their predictive models and discover insights faster.

 

My contribution

  • Created designs for integration with IBM’s Augmented Data Explorer application

  • Designed a functional virtual office space

  • Helped create a story from the loan data demonstrating the journey to insights

  • Created 3D model of Emma

Tools

  • Adobe Photoshop

  • Maya

  • Sketch

  • Unity 3D

 

Designing

Integrating into an existing IBM product

Immersive Data is a visualization tool; to use it, you need a pipeline for bringing in the customer’s data. IBM’s Augmented Data Explorer (ADE) lets us do this and more. Once the data is uploaded, ADE instantly analyzes the data set and surfaces interesting charts, patterns, and insights. I designed the user flows and the associated buttons and modals for our integration with ADE.

Modal1.png
 

Understanding the dataset

After selecting a model, the user needs to understand how and why it works. A data set this complex involves many factors; in this case, there are 25 variables to explore. Analyzing them helps explain the impact of each variable on the model output. We organized the variables using position and a color gradient: the yellow dots closest to the front are the variables that contribute most to the model output, while the green and blue dots further away contribute the least.
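The position-and-gradient encoding described above can be sketched as a simple mapping from a variable’s normalized contribution to a color and a depth. The yellow–green–blue ramp and the depth scale here are illustrative assumptions, not the actual values used in the app:

```python
def contribution_to_color(t):
    """Map a normalized contribution t in [0, 1] to an RGB color:
    1.0 -> yellow (most significant), 0.5 -> green, 0.0 -> blue."""
    yellow, green, blue = (1.0, 1.0, 0.0), (0.0, 1.0, 0.0), (0.0, 0.0, 1.0)
    if t >= 0.5:
        a, b, u = green, yellow, (t - 0.5) * 2.0   # interpolate green -> yellow
    else:
        a, b, u = blue, green, t * 2.0             # interpolate blue -> green
    return tuple(ai + (bi - ai) * u for ai, bi in zip(a, b))

def contribution_to_depth(t, max_depth=10.0):
    """Most significant variables sit closest to the front (depth 0);
    the least significant sit max_depth units away."""
    return (1.0 - t) * max_depth
```

For example, the strongest contributor (`t = 1.0`) maps to pure yellow at depth 0, and the weakest (`t = 0.0`) maps to pure blue at the back of the scene.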

 

SHAP values tell us how much each variable in the model contributes to the prediction. In a 2D chart of SHAP values, you can see the impact of a single variable, but interaction effects between multiple variables are much harder to spot. This is why visualizing the information in 3D is so valuable.
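To make the SHAP idea concrete: for a linear model f(x) = w·x + b with independent features, the SHAP value of feature i reduces to the closed form w_i · (x_i − E[x_i]). A minimal sketch of that special case, with illustrative loan-style weights and feature names rather than the real model:

```python
def linear_shap(weights, x, background_means):
    """Exact SHAP values for a linear model with independent features:
    feature i contributes w_i * (x_i - E[x_i]) to the prediction."""
    return [w * (xi - mu) for w, xi, mu in zip(weights, x, background_means)]

# Toy loan-style features (illustrative, not the real data set)
features = ["credit_limit", "age", "months_since_last_payment"]
weights = [-0.00001, 0.002, 0.05]        # model coefficients (assumed)
background_means = [50000.0, 35.0, 1.0]  # E[x_i] over the data set
sample = [20000.0, 45.0, 4.0]            # one customer

contribs = linear_shap(weights, sample, background_means)
# Rank variables by |SHAP value|, as the 3D view does with position
ranked = sorted(zip(features, contribs), key=lambda t: abs(t[1]), reverse=True)
```

For real tree or neural models you would use a SHAP explainer library rather than this closed form, but the ranking of variables by absolute SHAP value is exactly what drives the front-to-back layout described above.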

 

Creating the environment

Prior to this, Immersive Data existed only in augmented reality. The biggest change in the shift to VR was curating an intriguing yet practical environment for users to work in. This workspace sets the tone for the entire experience. I drew inspiration from some of IBM’s studios and conference rooms around the world; in fact, some of the materials in our virtual environment are photos of the real materials from the San Francisco office!

 

Incorporating Emma - The Voice Assistant

Making use of IBM Watson technology gives us an advantage over our competitors, since a voice assistant is not typically included in data visualization tools. For this project we focused on giving Emma some personality, creating expressions for different states including thinking, speaking, and processing.

 

Final Designs

Below are photos of the final virtual workspace I designed. The intent was simple, sleek, and functional. I emulated IBM’s Silicon Valley Lab, with the high-tech office set against green rolling hills. I also included some iconic pieces of Eames furniture to align the style with IBM’s.

 

Outcome

A demo of this concept was presented at the IBM Data and AI Forum in Miami in October 2019. We received a lot of positive feedback that encouraged us to push forward. In the meantime, the Immersive Data team is continuing to build on this VR experience to make it richer and more complete. Check out the full video of the demo here.

Image from iOS (2).jpg

In case you missed it above…