Discussion forum
CAD discussions, advice, exchange of experience

Please abide by the rules of this forum.
AliveInTheLab (RSS robots) | Joined: 20.Nov.2009 | Status: Offline | Points: 425
Posted: 16.Jun.2016 at 04:00
This article is very similar to the one I posted on Thursday of last week. The Autodesk Applied Research Lab pursues a broad scope of inquiry, from Advanced Robotics, the Internet of Things, and Machine Learning to Climate Change and the Future of Work. The team builds real-world prototypes to understand how cutting-edge technology will develop, and how those developments will affect the future of Autodesk and the world at large.

Virtual reality is an artificial world of computer-generated images and sounds that responds to the actions of the person experiencing it. Most first-person video games are examples of virtual reality. Augmented reality is an enhanced version of reality in which technology overlays digital information on an image of something viewed through a device, such as a smartphone camera. What separates augmented reality from virtual reality is the inclusion of the real, physical world in the environment being experienced.

Team member and Senior Research Engineer Evan Atherton filed this report based on his recent work with augmented reality:

One of the main goals of Autodesk's Applied Research Lab is to explore new ways of interacting with robots, from the way we plan a robot's motion to the way we visualize and adapt that motion. Augmented reality is a powerful tool that can fundamentally alter the motion-planning process. Using a device like an iPad or Microsoft HoloLens, we can project digital information about the robot into a physical space. We are currently exploring augmented reality for three main interactions: visualization, augmentation, and adaptation.
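Projecting digital information about a robot into a physical space ultimately comes down to mapping 3-D points onto the device's camera image. As a minimal sketch (a standard pinhole camera model with made-up intrinsics, not Autodesk's actual implementation), here is how a point on a virtual robot could be mapped to pixel coordinates:

```python
# Hypothetical sketch: project a 3-D point (in camera coordinates) onto the
# image plane with a pinhole camera model. The focal lengths (fx, fy) and
# principal point (cx, cy) below are illustrative values, not real device data.

def project_point(point_3d, fx, fy, cx, cy):
    """Return (u, v) pixel coordinates for a 3-D point in metres."""
    x, y, z = point_3d
    if z <= 0:
        # Points at or behind the camera cannot be drawn on screen.
        raise ValueError("point is behind the camera")
    return (fx * x / z + cx, fy * y / z + cy)

# A virtual robot joint 2 m in front of the camera, slightly right and up.
u, v = project_point((0.2, -0.1, 2.0), fx=800, fy=800, cx=640, cy=360)
print(round(u), round(v))  # 720 320
```

In a real AR pipeline this projection is done per frame for every vertex of the virtual robot, after transforming the model into the camera's coordinate frame using the tracked device pose.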
Here is a visualization of a virtual robot moving through the actual physical space of the Applied Research Lab. In this case, the colorful printout on the table is a marker that the software recognizes. Future versions will work without the marker: the software can be taught to recognize the shapes of objects such as the robot itself.

Here is that same visualization (which predicted how the robot would move) running alongside the robot actually moving. Technology like this allows people working with robots to see how a robot will behave before it performs its actions. That is possible even in the planning stages of a project, before the robot itself has arrived. By integrating these capabilities with other design tools like Autodesk Dynamo and Fusion 360, we can dramatically alter the current workflow for robotic path planning, giving designers, artists, and engineers more control in the creation process.

Thanks, Evan. Autodesk creates software that allows places, things, and media to be designed, made, and used. Robots are integral to the way many things are made today, but right now they act in isolation. With research like this on controlling robots via augmented reality, robots may come to work right alongside humans. It won't be a case of "robots took our jobs" but of "robots helped us with our jobs."

Virtualization is alive in the lab.

Go to the original post...
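Previewing a robot's motion before any hardware moves amounts to running the robot's kinematics on the planned joint angles and rendering the resulting poses. As a hedged sketch (a generic 2-link planar arm with illustrative link lengths, not the lab's actual robot), forward kinematics for each waypoint could look like this:

```python
# Hypothetical sketch: compute end-effector positions for a planned joint-space
# path on a 2-link planar arm, so the motion can be visualized before execution.
import math

def fk_2link(theta1, theta2, l1=0.4, l2=0.3):
    """End-effector (x, y) for joint angles in radians; link lengths in metres."""
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return x, y

# Preview a short joint-space path: these poses would be drawn in AR
# before the physical robot moves at all.
path = [(0.0, 0.0), (math.pi / 4, math.pi / 4)]
for t1, t2 in path:
    x, y = fk_2link(t1, t2)
    print(f"({x:.3f}, {y:.3f})")
```

An AR overlay would render each computed pose in place in the room, which is what lets a team evaluate a motion plan even before the project's robot is delivered.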
It's Alive in the Lab - Autodesk Labs blog by Scott Sheppard