You can use Isaac Sim to build a virtual indoor scene, place a wheeled robot inside it, add sensors such as LiDAR and cameras, and then run navigation on that robot. Isaac Sim already provides navigation examples for wheeled robots like Nova Carter, and its robot assets also include platforms such as Create 3 that are suitable for indoor navigation teaching. Isaac Sim scenes are USD-based, so adding objects to the world is fundamentally done by adding new USD primitives or asset references into the stage.
Also, Isaac Sim supports programmatic scene manipulation and object creation at runtime. NVIDIA’s Replicator tooling explicitly supports creating and placing objects programmatically, randomizing scenes, and even creating or destroying parts of the simulation pipeline on the fly within a single simulation instance. So for teaching, this is not only possible but a genuinely good topic, because it connects digital twins, perception, mapping, and navigation.
But there is one important engineering idea:
“Dynamically adding unseen objects” can mean two different things.
- Add the object to the simulator world
You detect or decide that a new chair/box/person should exist, then spawn that object in Isaac Sim during runtime.
- Update the robot’s navigation understanding of the world
Even if you do not permanently rebuild the global map, the robot can still react to dynamic obstacles through its sensors and local obstacle handling. Isaac Sim’s ROS 2 navigation examples note that dynamic obstacles can be detected by the robot’s sensors, and there is also an example where the robot avoids pallets that were not included in the initial map.
So for your course, I would break the lesson into four layers.
1. Course topic breakdown
Layer A — Build the static indoor world
Students learn how to create or import an indoor environment:
- floor
- walls
- doors
- tables
- shelves
- corridors
- robot start point
- goal points
This is the “known environment.”
Layer B — Put a wheeled robot into the world
Students choose a mobile robot and give it:
- differential drive or other wheeled base
- LiDAR
- RGB / depth camera
- IMU if needed
- ROS 2 bridge if using Nav2
Isaac Sim documentation includes wheeled robot navigation examples and ROS 2 driving workflows for such robots.
Layer C — Make the robot navigate
Students generate a 2D occupancy map from the 3D scene, then use Nav2 to:
- localize
- plan a path
- move to a goal
- avoid sensed obstacles
Isaac Sim’s ROS 2 navigation workflow explicitly includes occupancy-map generation and Nav2 setup.
Layer D — Handle unseen objects
Students learn the difference between:
- static map objects
- temporary dynamic obstacles
- newly discovered semipermanent objects
This is the most interesting part pedagogically, because it teaches that not every unseen object should be treated the same way.
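One way to make this distinction concrete for students is a small classifier over how long an object has persisted across observations. A minimal sketch; the threshold, field names, and category labels are illustrative, not Isaac Sim values:

```python
from dataclasses import dataclass

# Illustrative threshold (an assumption, not an Isaac Sim value): how long
# an unmapped object must persist before we treat it as semipermanent.
SEMIPERMANENT_MIN_S = 120.0

@dataclass
class ObservedObject:
    label: str
    first_seen_s: float   # time when the object was first detected
    last_seen_s: float    # most recent detection time
    in_static_map: bool   # was it already in the occupancy map?

def classify(obj: ObservedObject) -> str:
    """Return 'static-map', 'temporary', or 'semipermanent'."""
    if obj.in_static_map:
        return "static-map"
    if obj.last_seen_s - obj.first_seen_s >= SEMIPERMANENT_MIN_S:
        return "semipermanent"   # candidate for spawning + map update
    return "temporary"           # leave to local costmap avoidance
```

Students can then debate where the threshold should sit for each obstacle type, which is exactly the pedagogical point.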
2. A clean teaching sequence
I suggest teaching it in this order:
Phase 1 — Use an existing Isaac Sim example first
Do not begin by building a whole school building or lab from scratch.
Start with a built-in navigation example such as the warehouse / Nova Carter workflow. Students first need to understand:
- what a stage is
- what the robot asset is
- how sensors publish data
- how occupancy maps are generated
- how Nav2 sends a goal
Isaac Sim’s documentation already provides this workflow.
Phase 2 — Replace the environment
After students can run the example:
- remove the example warehouse
- build a simpler custom indoor scene
- regenerate the occupancy map
- test navigation again
Phase 3 — Add runtime object spawning
After the base navigation works:
- add boxes or furniture during simulation
- see whether the robot’s sensors detect them
- compare “local avoidance only” vs “map update”
Phase 4 — Connect to real-world observations
Finally, explain how real robot sensing could drive simulator updates:
- real LiDAR/camera detects a new obstacle
- your middleware decides its approximate pose and class
- Isaac Sim spawns a matching object
- navigation stack uses local costmap or updated world model
That is the full digital-twin lesson.
3. Step-by-step practical workflow
Now I will give you the step-by-step version.
Step 0 — Check your environment
- Follow the setup steps at: https://ihsumlee.sytes.net/node/33
Step 1 — Prepare the teaching goal
For the first class, define a simple target:
Goal:
A wheeled robot moves from Room A to Room B inside a simple indoor map, while avoiding newly added boxes.
That is enough. Keep it small.
Step 2 — Start from a known mobile-robot example
Use a built-in wheeled robot example first, rather than building the robot from zero.
Good teaching choice:
- Nova Carter navigation example
- or a simpler differential-drive robot like TurtleBot / Create 3 style workflows, depending on what you want to emphasize
Isaac Sim documents ROS 2 navigation with Nova Carter, and also ROS 2 driving workflows for TurtleBot-style robots. It also documents Create 3 as an indoor-navigation-suitable robot asset.
Step 3 — Open or create the indoor environment
You have two main ways:
Option A — Easiest for teaching
Start from an example scene and modify it.
Option B — Build your own simple scene
In Isaac Sim:
- create a ground plane
- create walls with cubes
- create tables/boxes/chairs with primitive shapes or imported assets
- organize them using Xform parents
Isaac Sim’s robot/environment tutorials explain that adding objects to the scene is essentially creating USD prims in the stage.
For teaching, build a “corridor + two rooms” map first.
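To make the scene-building step concrete, here is a sketch that generates wall footprints for a simplified two-rooms-with-a-doorway variant of that map. In Isaac Sim each box would become a scaled cube prim under a common Xform parent; all names and dimensions below are illustrative assumptions:

```python
WALL_T = 0.2  # wall thickness in meters (illustrative)

def two_rooms_with_doorway(width=10.0, depth=6.0, door=1.0):
    """Wall footprints for two rooms joined by a doorway in a dividing
    wall at x = width/2. Each wall is (name, (cx, cy), (size_x, size_y))."""
    mid = width / 2
    gap_lo = depth / 2 - door / 2   # doorway spans gap_lo .. gap_hi in y
    gap_hi = depth / 2 + door / 2
    return [
        ("south", (mid, 0.0), (width, WALL_T)),
        ("north", (mid, depth), (width, WALL_T)),
        ("west", (0.0, depth / 2), (WALL_T, depth)),
        ("east", (width, depth / 2), (WALL_T, depth)),
        # dividing wall, split into two segments so the doorway stays open
        ("divider_lower", (mid, gap_lo / 2), (WALL_T, gap_lo)),
        ("divider_upper", (mid, (gap_hi + depth) / 2), (WALL_T, depth - gap_hi)),
    ]
```

Generating the layout as data first also makes the later occupancy-map discussion easier, because the same footprints can be rasterized in 2D.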
Step 4 — Insert the robot
Place the wheeled robot at a known start pose:
- facing forward in the corridor
- not intersecting any wall
- with enough free space around it
Then verify:
- wheels are rigged correctly
- robot base frame is sensible
- sensors exist and publish
Step 5 — Add the required sensors
For navigation, the minimum useful setup is:
- LiDAR for obstacle detection
- camera optional
- IMU optional
- odometry / TF chain
In Isaac Sim’s navigation examples, the robot uses sensing pipelines that integrate with ROS 2 / Nav2.
Step 6 — Generate the occupancy map
This is the critical bridge between 3D world and 2D navigation.
Isaac Sim provides an Occupancy Map Generator workflow. In the navigation docs, the map is generated from the warehouse environment and saved for Nav2 use.
Teaching idea:
- show students the 3D scene
- then show the 2D occupancy image
- explain that navigation often plans in 2D even when the simulator is 3D
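The idea behind occupancy-map generation can be demonstrated with a toy rasterizer, independent of Isaac Sim's actual Occupancy Map Generator (which works on the USD stage directly). This sketch assumes walls are given as axis-aligned footprints `(name, (cx, cy), (size_x, size_y))`:

```python
def rasterize(walls, width_m, depth_m, resolution=0.1):
    """Toy occupancy rasterizer: mark a cell occupied (1) if its center
    falls inside any wall footprint (name, (cx, cy), (size_x, size_y))."""
    nx, ny = round(width_m / resolution), round(depth_m / resolution)
    grid = [[0] * nx for _ in range(ny)]
    for _name, (cx, cy), (sx, sy) in walls:
        for j in range(ny):
            py = (j + 0.5) * resolution   # cell-center y
            if not (cy - sy / 2 <= py <= cy + sy / 2):
                continue
            for i in range(nx):
                px = (i + 0.5) * resolution   # cell-center x
                if cx - sx / 2 <= px <= cx + sx / 2:
                    grid[j][i] = 1
    return grid
```

Showing students the printed grid next to the 3D viewport makes the "plan in 2D, simulate in 3D" point almost self-evident.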
Step 7 — Launch Nav2
Once the map exists:
- start simulation
- launch ROS 2 Nav2
- verify robot localization
- send a goal from RViz or action graph
Isaac Sim’s ROS 2 navigation tutorial walks through this flow.
Step 8 — Validate baseline behavior
Before adding complexity, verify:
- robot can reach a goal in the static map
- localization is stable
- no strange collisions
- sensor data is consistent
Only after this should students move to unseen-object experiments.
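Baseline goal-reaching can be checked automatically with a small helper. The tolerance names echo Nav2's goal-checker parameters, but the signature and default values here are illustrative:

```python
import math

def goal_reached(pose, goal, xy_tol=0.25, yaw_tol=0.25):
    """pose and goal are (x, y, yaw_radians). Tolerance names echo Nav2's
    xy/yaw goal tolerances; the values are illustrative defaults."""
    dx, dy = goal[0] - pose[0], goal[1] - pose[1]
    # wrap the heading error into (-pi, pi] so 3.1 vs -3.1 counts as close
    dyaw = math.atan2(math.sin(goal[2] - pose[2]), math.cos(goal[2] - pose[2]))
    return math.hypot(dx, dy) <= xy_tol and abs(dyaw) <= yaw_tol
```

Logging this check over several runs gives students a simple pass/fail record for the baseline before the unseen-object experiments begin.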
4. How to teach “unseen objects”
Here is the key conceptual split.
Case 1 — Temporary unseen object
Example:
- a person walks by
- a cart passes through
- a box is dropped temporarily
In this case, you usually do not rebuild the global map.
Instead:
- let LiDAR/camera detect it
- local planner / costmap reacts
- robot slows, replans locally, or waits
This is already aligned with Isaac Sim examples that mention detecting dynamic obstacles and avoiding scene obstacles not present in the initial map.
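The "local planner / costmap reacts" step can be modeled in a few lines: a sensed obstacle marks grid cells (plus an inflation radius), and the planner re-checks its path. A toy sketch of the idea, not the Nav2 costmap API:

```python
def mark_obstacle(grid, cell, inflation=1):
    """Mark a sensed obstacle cell, plus an inflation radius in cells,
    as occupied (a toy version of a local costmap inflation layer)."""
    ny, nx = len(grid), len(grid[0])
    ci, cj = cell  # (column, row)
    for j in range(max(0, cj - inflation), min(ny, cj + inflation + 1)):
        for i in range(max(0, ci - inflation), min(nx, ci + inflation + 1)):
            grid[j][i] = 1

def row_blocked(grid, row, i0, i1):
    """Is the straight corridor segment along `row` now blocked?"""
    return any(grid[row][i] for i in range(min(i0, i1), max(i0, i1) + 1))
```

The key observation for students: the global map never changed, yet the path check fails after the obstacle appears, which is exactly why the robot can still react.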
Case 2 — Newly discovered semipermanent object
Example:
- someone placed a cabinet in the hallway
- a new shelf now blocks an area
- a lab table has been moved and stays there
In this case, you may want to:
- spawn/update the object in Isaac Sim
- update the world model
- possibly regenerate or revise the navigation map
This is the “digital twin synchronization” idea.
5. Is dynamic object addition possible in Isaac Sim?
Yes.
There are two practical ways.
Way A — Manual addition during runtime
During the simulation:
- create a box/cylinder/chair asset in the stage
- position it in front of the robot
- enable collision / physics
- continue simulation
Because Isaac Sim scenes are USD stages, objects can be added as prims or asset references.
Way B — Scripted runtime spawning
Use Python scripting to:
- listen for an event
- compute the object pose
- spawn the object into the stage
- attach collision and rigid-body properties
- optionally randomize dimensions or appearance
NVIDIA’s Replicator tooling specifically supports programmatic object creation and placement, and Isaac Sim randomization tutorials show scene changes during runtime.
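The bookkeeping side of such a script can be sketched in plain Python. The actual USD insertion call is stubbed out as a comment, and the asset paths and class names here are hypothetical:

```python
import itertools

# Hypothetical asset library; these paths are placeholders, not shipped assets.
ASSET_LIBRARY = {
    "box": "/Library/box.usd",
    "pallet": "/Library/pallet.usd",
    "chair": "/Library/chair.usd",
}

class SpawnManager:
    """Bookkeeping for runtime-spawned obstacles (USD calls stubbed out)."""

    def __init__(self, root="/World/Spawned"):
        self.root = root
        self._counter = itertools.count()
        self.spawned = []  # (prim_path, asset_path, pose)

    def spawn(self, label, pose):
        # unknown labels fall back to a generic box proxy
        asset = ASSET_LIBRARY.get(label, ASSET_LIBRARY["box"])
        prim_path = f"{self.root}/{label}_{next(self._counter)}"
        # In Isaac Sim, this is where you would add the USD asset reference
        # at prim_path and set its transform and collision properties.
        self.spawned.append((prim_path, asset, pose))
        return prim_path
```

Keeping spawn records in one place also lets students later despawn objects or mark them persistent, which feeds directly into the map-update discussion.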
6. Step-by-step: how to do dynamic addition
Here is the engineering workflow I recommend teaching.
Method 1 — Simplest classroom demo
This is the best first lab.
Step 1
Start the indoor navigation scene.
Step 2
Send the robot toward a goal.
Step 3
While it is moving, manually insert a box into the corridor.
Step 4
Observe:
- does LiDAR see it?
- does the local planner avoid it?
- does the robot stop?
Step 5
Discuss:
- Was the obstacle in the original map?
- If not, why could the robot still react?
This directly teaches local obstacle avoidance.
Method 2 — Scripted spawning
This is the better advanced lab.
Step 1
Prepare a small library of USD assets:
- box
- pallet
- chair
- trash bin
Step 2
Write a Python script that:
- waits for a trigger
- inserts an asset reference into the stage
- sets position, orientation, and scale
- enables collision / rigid body if needed
Step 3
Trigger conditions can be:
- simulation time reaches 20 s
- robot enters a region
- keyboard command
- ROS 2 message
- sensor-detected event from the real robot side
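These trigger conditions can be expressed as data and evaluated by one function. The trigger schema below is an assumption chosen for illustration, not an Isaac Sim API:

```python
def should_spawn(sim_time_s, robot_xy, trigger):
    """Evaluate one trigger dict (schema is illustrative, not an Isaac Sim API)."""
    kind = trigger["kind"]
    if kind == "time":        # simulation time reaches a threshold
        return sim_time_s >= trigger["at_s"]
    if kind == "region":      # robot enters an axis-aligned region
        xmin, ymin, xmax, ymax = trigger["region"]
        x, y = robot_xy
        return xmin <= x <= xmax and ymin <= y <= ymax
    if kind == "message":     # keyboard / ROS 2 / real-sensor event flag
        return trigger.get("received", False)
    return False
```

A list of such triggers, polled once per simulation step, is enough for all of the classroom scenarios above.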
Step 4
After spawning:
- keep the object as a dynamic obstacle
- or mark it as static if you want it to remain
Step 5
Let the navigation stack respond through sensors and costmaps.
This is fully consistent with Isaac Sim’s scene-programming and runtime-randomization capabilities.
7. If the real robot sees an unseen object, how do we reflect it in Isaac Sim?
Yes, this is possible, but you need a pipeline.
Practical pipeline
Step 1 — Perception on the real robot
Real sensors detect a new object:
- LiDAR cluster
- RGB-D detection
- object detector + depth
- SLAM update
Step 2 — Estimate the object state
Compute:
- object type or class
- approximate pose
- size estimate
- confidence
Step 3 — Send the information to Isaac Sim
Use ROS 2 or another middleware to publish:
- object label
- x, y, z
- yaw
- dimensions
- timestamp
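That message can be modeled as a small data type with a serialization round-trip. In practice you would define a custom ROS 2 message instead; the JSON encoding here is purely illustrative:

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class ObjectReport:
    """Fields mirror the list above. A real system would use a custom
    ROS 2 message; JSON keeps this sketch self-contained."""
    label: str
    x: float
    y: float
    z: float
    yaw: float
    dimensions: tuple   # (length, width, height) in meters
    stamp_s: float

def encode(report):
    return json.dumps(asdict(report))

def decode(payload):
    d = json.loads(payload)
    d["dimensions"] = tuple(d["dimensions"])  # JSON arrays come back as lists
    return ObjectReport(**d)
```

Defining the schema explicitly, even in this toy form, forces students to decide early what the perception side must actually estimate.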
Step 4 — Spawn a virtual proxy object in Isaac Sim
A listener script in Isaac Sim receives that message and:
- chooses a matching USD asset
- inserts it into the stage
- places it at the estimated pose
Step 5 — Update how navigation uses it
You have two choices:
- local reaction only: let sensors see the new object
- persistent world update: revise map / occupancy representation if the object remains
This is the key teaching point: the simulator becomes a continuously updated virtual copy of the real environment.
8. What students should understand conceptually
This lesson is actually about three models of the world:
Model 1 — Ground-truth simulator world
What truly exists in Isaac Sim.
Model 2 — Robot perceived world
What LiDAR/camera currently sees.
Model 3 — Navigation map
What the planner believes is navigable.
These three are not always identical.
That is why unseen-object handling is such a good robotics topic.
9. A recommended lab design for your course
I would teach it as a 3-lab sequence.
Lab 1 — Static indoor navigation
Students:
- load a wheeled robot
- build a simple indoor scene
- generate occupancy map
- run Nav2
- send goal points
Lab 2 — Dynamic obstacle reaction
Students:
- add runtime boxes/obstacles
- observe LiDAR detection
- compare global map vs local avoidance
- record whether the robot stops, detours, or fails
Lab 3 — Real-to-sim environment update
Students:
- simulate a “real robot detection message”
- publish object pose through ROS 2
- spawn the object in Isaac Sim automatically
- test whether navigation still succeeds
That sequence moves from simple to advanced very naturally.
10. Common misunderstandings to warn students about
Misunderstanding 1
“If the object is not in the original map, the robot must crash.”
Not necessarily.
If the robot has live obstacle sensing and a local planner/costmap, it can still react to newly observed obstacles. Isaac Sim’s navigation examples demonstrate this idea.
Misunderstanding 2
“Dynamic obstacle handling means I must regenerate the whole map every time.”
No.
Only persistent environment changes usually justify map rebuilding. Temporary obstacles are usually handled locally.
Misunderstanding 3
“Spawning an object in Isaac Sim automatically fixes navigation.”
Not by itself.
You also need the sensing/planning side to use that new information correctly.
11. Best first implementation path
For your course, I recommend this exact order:
- Use Isaac Sim’s built-in ROS 2 navigation example first.
- Replace the environment with a very simple custom indoor map.
- Regenerate the occupancy map.
- Make the wheeled robot reach fixed goals.
- Add a runtime box manually.
- Then automate runtime spawning with Python / ROS 2.
- Finally, teach how a real robot’s perception can update the simulator.
12. My honest recommendation
For a first course version, do not begin with full real-to-sim automatic synchronization.
Start with:
- custom indoor scene
- wheeled robot navigation
- one runtime-added obstacle
- local avoidance demonstration
Once that works, extend to:
- ROS 2 message-driven spawning
- persistent map update
- digital twin synchronization