Chapter 14. Synthetic Shopping
You’ve had some initial exposure to using Unity to generate a custom synthetic dataset, but you’ve only scratched the surface.
In this chapter, we’ll combine what we’ve learned so far to further explore the possibilities and features of Unity Perception, and talk about how you can apply them to your own projects.
Specifically, we’re going to create a full-featured synthetic dataset using Unity and Perception: images of items that one might find at a supermarket, nicely annotated and tagged.
Imagine an AI-powered shopping trolley that knows which items you’re touching as you take them from the shelves (you don’t have to stretch far to imagine it, since it’s real!). To train such a thing, you’d need a large corpus of data showing packages of products you’d find in a supermarket. You’d need images of the packages from a huge variety of angles, with a variety of things behind them, and you’d need them labeled so that the model you train on them can learn which package is which.
We’re going to make that dataset in this chapter.
Creating the Unity Environment
First, we need to build the world inside Unity that will create our randomized shop images. The world, in this case, is a scene to which we add randomizers in order to produce the range of images we need.
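Before we build the scene, here’s a minimal sketch of what a randomizer looks like in code, based on the Unity Perception randomization API: a tag component that marks which objects to affect, and a randomizer that rotates every tagged object at the start of each iteration. The names GroceryRotationRandomizer and GroceryRotationTag are our own, for illustration only, and the exact namespaces and the return type of tagManager.Query<T> vary between Perception package versions:

using System;
using UnityEngine;
using UnityEngine.Perception.Randomization.Parameters;
using UnityEngine.Perception.Randomization.Randomizers;
using UnityEngine.Perception.Randomization.Samplers;

// Tag component: attach to any product prefab that should be rotated.
// (GroceryRotationTag is an illustrative name, not part of the package.)
public class GroceryRotationTag : RandomizerTag { }

[Serializable]
[AddRandomizerMenu("Perception/Grocery Rotation Randomizer")]
public class GroceryRotationRandomizer : Randomizer
{
    // Uniformly samples a rotation angle (in degrees) for each axis.
    public FloatParameter angle =
        new FloatParameter { value = new UniformSampler(0f, 360f) };

    // Runs at the start of each scenario iteration, before frames are captured.
    protected override void OnIterationStart()
    {
        // Find every object in the scene carrying our tag and rotate it.
        // (Recent Perception versions return the tag components from Query<T>;
        // older versions returned the tagged GameObjects instead.)
        foreach (var tag in tagManager.Query<GroceryRotationTag>())
        {
            tag.transform.rotation =
                Quaternion.Euler(angle.Sample(), angle.Sample(), angle.Sample());
        }
    }
}

Once a randomizer like this is added to the scene’s scenario and the tag component is attached to the product prefabs, Perception gives every tagged object a fresh random orientation on each iteration, which is exactly how we’ll generate the variety of angles our dataset needs.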
To get the Unity environment up and running, follow these steps:
- Create a brand-new Unity project, selecting the Universal Render Pipeline (URP) template again, as shown ...