What is Extended Reality
Extended reality (XR) is a term that encompasses real and virtual environments that are generated by wearable devices or computer technology to provide an immersive experience. It can also be described as a collection of all immersive technologies that combine real and virtual worlds.
What is Augmented Reality
Augmented reality (AR) is the real-time use of information in the form of text, graphics, audio and other virtual enhancements integrated with real-world objects. It is this "real world" element that differentiates AR from virtual reality.
For example, think of Pokémon Go, where users search their real-life neighborhoods for animated characters that pop up on their phone or tablet.
What is Virtual Reality
Virtual reality (VR) is a simulated experience that employs pose tracking and 3D near-eye displays to give the user an immersive feel of a virtual world. Applications of virtual reality include entertainment (particularly video games), education (such as medical or military training) and business (such as virtual meetings).
In other words, virtual reality is a computer-generated environment with scenes and objects that appear real, making users feel immersed in their surroundings.
What is Mixed Reality
Mixed reality (MR) is a combination of AR and VR, where one can interact with the digital and the real world simultaneously. Users view their surroundings through special MR devices, which are much more powerful than VR headsets, and costlier too! In return, these devices give you the power to interact with your surroundings digitally. For example, putting on an MR device gives you a view of your entire surroundings; you can throw a ball, close the windows, and so on, and these actions happen digitally inside the headset while, in actual reality, things remain as they are. Many companies are investing huge amounts of money in deeper research into this field.
In a nutshell, using extended reality (XR), people can visit places virtually, feel as if they are actually present there, and interact with other individuals. XR is thus the umbrella over all three: AR, VR, and MR.
Dev Tools for XR / VR / AR
Unity 3D
Unity's XR platform aims to provide the tools necessary to achieve the core principle of the Unity engine, "Build once, deploy anywhere," for VR and AR projects, so that you can target any number of different platforms and devices with a single version of your content.
Using Unity, developers can create immersive XR experiences by building 3D environments and adding interactive elements, such as audio, video, and animations.
Google VR for Everyone
Google offers a wide range of VR development tools, and you can use them to create an immersive VR experience for your stakeholders. You can access these tools on the Google VR developer site.
Blender
Blender is a free and open-source 3D creation suite. It supports the entirety of the 3D pipeline: modeling, rigging, animation, simulation, rendering, compositing and motion tracking, even video editing and game creation. Advanced users employ Blender's API for Python scripting to customize the application and write specialized tools; often these are included in Blender's future releases. Blender is well suited to individuals and small studios who benefit from its unified pipeline and responsive development process.
3ds Max
3ds Max is a popular 3D modeling and rendering package from Autodesk, used to model, animate, and render detailed 3D characters, photorealistic designs, and complex scenes for film and TV, games, and design visualization projects.
SketchUp Studio
SketchUp Studio is a powerful 3D modeling tool focused on the construction industry and architecture, and you can use it for virtual reality app development. It's useful for use cases like architecture, commercial interior design, landscape architecture, residential construction, 3D printing, and urban planning.
Unreal Engine
Unreal Engine is a free-to-use game development engine owned by Epic Games. It can create a large variety of 3D, 2D, and VR game styles. The platform is known for its impressive graphical and lighting capabilities. It includes a library of materials, objects, and characters. In addition to supported coding languages, Unreal Engine offers a visual editor, called Blueprints, for creating the rules of the game, which requires no coding experience.
Three.js
Three.js is an open-source JavaScript library used to display 2D and 3D graphics in the web browser. It uses the WebGL API behind the scenes, which lets you use the GPU (Graphics Processing Unit) to render graphics and 3D objects on a canvas in the web browser. Since it is JavaScript, it can also interact with other HTML elements.
ARCore
ARCore is Googleās platform for building augmented reality experiences. Using different APIs, ARCore enables your phone to sense its environment, understand the world and interact with information. Some of the APIs are available across Android and iOS to enable shared AR experiences.
echo3D
echo3D is a cloud platform for 3D/AR/VR/Spatial Computing content management and distribution that provides tools and network infrastructure to help developers & companies build and deploy 3D apps, games, and content.
Vuforia
Vuforia is a software development kit (SDK) which is a collection of software tools, libraries, and documentation that enables developers to build applications for specific platforms. It can be used to build augmented reality apps for mobile devices.
Vuforia is not a standalone tool but rather depends on other tools like the Unity game engine. Unity is a cross-platform game engine developed by Unity Technologies to facilitate game development, virtual reality applications, and even the training of artificial intelligence (AI) models.
Spark AR Studio
Spark AR Studio is an augmented reality platform for Mac and Windows that lets you easily create AR effects for mobile cameras. Think of it as Photoshop or Sketch, but for AR. Basically, Spark AR is a Facebook studio tool that allows users to build their own augmented reality effects.
Application of XR / VR / AR
Retail :
VR technologies help retailers make shopping engaging, improve traditional customer interactions, and thus stand out from competitors. Enterprises initiating retail software development incorporate VR in their projects to build a profound emotional connection with a customer, increasing their loyalty and trust.
Education & Training:
Many industries use AR/VR to provide professional training to their employees. For younger students, AR/VR is part of educational games, virtual field trips, and, in general, experiencing the world.
Automotive :
Virtual reality is widely used in the development of smart cars that will flood the market in the future. Cars learn how to drive, turn, and stop using artificial intelligence (AI) and virtual reality.
Space science :
VR enables trainees to go through preparation with minimal risks and even helps astronauts inspect and maintain equipment on the ISS without direct assistance from mission control.
Entertainment and gaming :
AR/VR is being introduced to cinemas and theme parks to simulate movie-like adventures and let people experience their favorite cinematographic masterpieces.
Navigation :
Navigation applications that implement AR use several inputs such as the user's location via GPS, initial camera measurement, and object location. They also track object movement. Sensors collect this information and connect it to IMUs of the objects in motion.
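Before an AR overlay can anchor a direction arrow in the camera view, the app has to know how far away the destination is and in which compass direction it lies relative to the user's GPS fix. A minimal sketch of that math (the function names are illustrative, not from any real navigation SDK):

```javascript
// Great-circle math for an AR navigation overlay: distance and initial
// bearing from the user's GPS position to a destination.
const EARTH_RADIUS_M = 6371000;
const toRad = (deg) => (deg * Math.PI) / 180;

// Haversine distance in meters between two lat/lon points.
function distanceMeters(lat1, lon1, lat2, lon2) {
  const dLat = toRad(lat2 - lat1);
  const dLon = toRad(lon2 - lon1);
  const a =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(lat1)) * Math.cos(toRad(lat2)) * Math.sin(dLon / 2) ** 2;
  return 2 * EARTH_RADIUS_M * Math.asin(Math.sqrt(a));
}

// Initial bearing in degrees clockwise from north toward the destination;
// compared against the phone's compass heading to rotate the AR arrow.
function bearingDegrees(lat1, lon1, lat2, lon2) {
  const dLon = toRad(lon2 - lon1);
  const y = Math.sin(dLon) * Math.cos(toRad(lat2));
  const x =
    Math.cos(toRad(lat1)) * Math.sin(toRad(lat2)) -
    Math.sin(toRad(lat1)) * Math.cos(toRad(lat2)) * Math.cos(dLon);
  return ((Math.atan2(y, x) * 180) / Math.PI + 360) % 360;
}
```

In a real app these values would be fused with the camera and IMU data described above; this sketch covers only the GPS side.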
Military :
VR therapy helps soldiers suffering from battlefield trauma overcome these conditions and prepare for new or unexpected situations. AR- and VR-backed tactical military training will revolutionize how soldiers process data and instructions to execute operations in combat missions. The military also benefits from reduced training spending and lower soldier mortality rates.
Major challenges of these technologies
One major challenge of VR in education is the lack of effective social interaction and collaboration features. VR can enable learners and educators to interact and collaborate with each other and with virtual characters or agents in various ways. However, VR also poses technical and social challenges, such as latency, bandwidth, synchronization, communication, coordination, and trust. Social interaction and collaboration are essential for learning, especially for developing social and emotional skills such as empathy, communication, and teamwork. Therefore, educators and designers need to enhance and facilitate social interaction and collaboration in VR for education.
Data collection by AR devices may not be ideal for ensuring personal privacy
AR is vulnerable to threats and unauthorized access that lead to severe consequences
Information overload can cause stress and lead to inaction
Smart glasses can cause eye ailments if quality is compromised
Cost is the most prominent challenge faced by companies developing XR. XR devices are very costly: many technologies work together, and a lot of hardware goes into making these devices, so the price is high. If the cost stays high, the masses will not be able to afford the product, companies will not be able to grow their sales, and investors will have little motivation to put their money into XR.
Developing the hardware of XR devices is also a challenge for companies in this field. Since so many technologies, software components, and parts are involved, building the hardware is a difficult task. The hardware must not just be robust but also compact, able to process a lot of information quickly and smoothly, and, on top of all that, cheap.
Hardware Tools / Components
VR headset :
For virtual reality to work, there must be an optical system in a head-mounted display that will project an image on a display in front of your eyes.
In this optical system, an HMD includes light sources (display), receivers (eyes), and optical elements (lenses).
The light sources in an HMD are microdisplays, such as organic light emitting diodes (OLED) or liquid crystal displays (LCD). A binocular HMD typically has two displays that provide separate images for each eye and generate 3D perception through stereoscopy. In a holographic HMD, the light source is modulated coherent light from a spatial light modulator (SLM).
The receivers in HMDs are the eyes of the user.
The optical elements collect light from the source and generate renderings of a 3D virtual world. An ideal VR HMD must be able to provide a high-resolution image within a large field of view (FOV) while supporting accommodation cues for 3D perception, and it must have a large eyebox (exit pupil) with a compact form factor.
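The tension between resolution and field of view can be made concrete with a back-of-the-envelope pixels-per-degree figure. The numbers below are illustrative, not specs of any particular headset:

```javascript
// Rough angular resolution of an HMD: horizontal pixels per eye divided by
// the horizontal field of view. Human foveal vision resolves roughly 60
// pixels per degree, which is why wide-FOV headsets look less sharp than
// monitors despite similar panel resolutions.
function pixelsPerDegree(horizontalPixelsPerEye, horizontalFovDegrees) {
  return horizontalPixelsPerEye / horizontalFovDegrees;
}

// Example: an 1800-pixel-wide per-eye panel spread over a 100-degree FOV
// yields 18 ppd, well below the ~60 ppd "retinal" target.
const ppd = pixelsPerDegree(1800, 100);
```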
The whole point of VR is to immerse yourself in a new world. However, other than the VR headset, there are also many other parts necessary for VR to actually work. VR headsets like Playstation VR and Oculus Rift are called head-mounted displays, which means that the screen is mounted to your face. Wherever you move your head, the screen follows you.
For certain VR headsets like the HTC Vive and the Oculus Rift, a console or computer is needed for the headsets to work. Video is sent from the console or the computer to the VR headset. For other headsets like the Google Daydream and the Samsung Gear VR, a smartphone has to be slotted into the headset, and the video plays from the phone.
VR headsets either use two LCD displays (one per eye) or two feeds sent to one display. Headsets also have lenses placed between your eyes and the screen, which are used to focus and reshape the picture for each eye. They create a stereoscopic 3D image by angling the two 2D images. This is because the lenses mimic how each of our two eyes sees the world very slightly differently.
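The stereoscopic trick above boils down to rendering the scene twice from two viewpoints separated by the interpupillary distance (IPD). A minimal sketch, using plain objects in place of a real engine's camera type (names and the default IPD value are illustrative):

```javascript
// Stereoscopic rendering offsets one virtual camera per eye by half the
// interpupillary distance (IPD) along the head's right vector, so each eye
// sees the scene from a slightly different viewpoint.
const DEFAULT_IPD_M = 0.063; // average adult IPD, about 63 mm

function eyePositions(headPosition, rightVector, ipd = DEFAULT_IPD_M) {
  const half = ipd / 2;
  const shift = (sign) => ({
    x: headPosition.x + sign * half * rightVector.x,
    y: headPosition.y + sign * half * rightVector.y,
    z: headPosition.z + sign * half * rightVector.z,
  });
  return { left: shift(-1), right: shift(+1) };
}

// Head at standing eye height, facing down -z with +x as its right vector:
const eyes = eyePositions({ x: 0, y: 1.7, z: 0 }, { x: 1, y: 0, z: 0 });
```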
VR headsets also need a frame rate of at least 60 frames per second for the user not to feel sick. Current VR headsets go well beyond this, with the Oculus and the HTC Vive at 90 frames per second and PlayStation VR at 120 frames per second.
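Those refresh rates translate directly into a hard per-frame time budget for the application:

```javascript
// The refresh rate fixes the time available to simulate and render each
// frame; missing the budget causes dropped frames and discomfort.
function frameBudgetMs(refreshRateHz) {
  return 1000 / refreshRateHz;
}

frameBudgetMs(60);  // ~16.7 ms per frame
frameBudgetMs(90);  // ~11.1 ms per frame
frameBudgetMs(120); // ~8.3 ms per frame
```

At 120 Hz the renderer has barely 8 ms to do everything, which is why VR titles are tuned so aggressively for performance.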
PlayStation Camera :
The PlayStation Eye is capable of capturing standard video at frame rates of 60 hertz at 640×480 pixel resolution and 120 hertz at 320×240 pixels, which is "four times the resolution" and "two times the frame rate" of the EyeToy, according to Sony. Higher frame rates, up to 320×240 at 187 fps or 640×480 at 75 fps, can be selected by specific applications (FreeTrack and LinuxTrack).
The PlayStation Eye also has "two times the sensitivity" of the EyeToy with Sony collaborating with sensor chip partner OmniVision Technologies on a sensor chip design using larger sensor pixels, allowing more effective low-light operation. Sony states that the PlayStation Eye can produce "reasonable quality video" under the illumination provided by a television set.
The camera features a two-setting adjustable fixed-focus zoom lens. Selected manually by rotating the lens barrel, the PlayStation Eye can be set to a 56° field of view (red dot), similar to that of the EyeToy, for close-up framing in chat applications, or a 75° field of view (blue dot) for long-shot framing in interactive physical gaming applications.
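The practical difference between the two lens settings is how much of the scene the camera frames at a given distance, which follows from simple trigonometry (width = 2·d·tan(FOV/2)):

```javascript
// Horizontal scene width covered by a camera with a given horizontal FOV
// at a given distance from the subject.
function coverageWidthMeters(fovDegrees, distanceMeters) {
  const halfFovRad = (fovDegrees * Math.PI) / 360; // FOV/2 in radians
  return 2 * distanceMeters * Math.tan(halfFovRad);
}

// At 2 m, the 56-degree setting frames ~2.1 m of the scene while the
// 75-degree setting frames ~3.1 m, which is why the wider setting suits
// room-scale physical gaming and the narrower one suits chat framing.
```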
The PlayStation Eye is capable of outputting video to the console uncompressed, with "no compression artifacts" or with optional JPEG compression. 8 bits per pixel is the sensor's native color depth.
Virtuix Omni Platform :
The Virtuix Omni is an omnidirectional treadmill simulator for virtual reality games and other applications. It uses a platform to simulate locomotion i.e. the motion of walking, requiring both special shoes or shoe covers and a surface that reduces friction. It works in conjunction with the HTC Vive and allows a Vive user to physically walk within a limited number of supported games.
In 2013, the Virtuix Omni became one of the ten biggest technology Kickstarter campaigns, raising $1.1 million in funding. Since then, Virtuix has raised another $35 million from private and institutional investors.
PlayStation Move :
The primary component of PlayStation Move, the PlayStation Move motion controller, is a wand controller that allows the user to interact with the console through motion and position in front of a PlayStation camera. It functions similarly to the Wii Remote.
The PlayStation Move motion controller features an orb at the head which can glow in any of a full range of colors using RGB light-emitting diodes (LEDs). Based on the colors in the user environment captured by the camera, the system dynamically selects an orb color that can be distinguished from the rest of the scene. The colored light serves as an active marker, the position of which can be tracked along the image plane by the camera. The uniform spherical shape and known size of the light also allows the system to simply determine the controller's distance from the camera through the light's image size, thus enabling the controller's position to be tracked in three dimensions with high precision and accuracy. The simple sphere-based distance calculation allows the controller to operate with minimal processing latency, as opposed to other camera-based control techniques on the PlayStation 3.
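The sphere-based distance calculation mentioned above is essentially the pinhole camera model: a sphere of known diameter D that appears d pixels wide at focal length f (in pixels) is at distance z ≈ f·D/d. A sketch with illustrative values (the real tracking library is more sophisticated):

```javascript
// Range estimate for a tracked sphere of known physical size under a
// pinhole camera model: distance = focalLength * realDiameter / imageDiameter.
function sphereDistanceMeters(focalLengthPx, sphereDiameterM, imageDiameterPx) {
  return (focalLengthPx * sphereDiameterM) / imageDiameterPx;
}

// A 44 mm orb imaged 40 px wide by a camera with a 600 px focal length
// is roughly 0.66 m away; as the orb moves closer, its image grows and
// the estimated distance shrinks.
```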
A pair of inertial sensors inside the controller, a three-axis linear accelerometer and a three-axis angular rate sensor, are used to track rotation as well as overall motion. An internal magnetometer is also used for calibrating the controller's orientation against the Earth's magnetic field to help correct against cumulative error (drift) by the inertial sensors. In addition, an internal temperature sensor is used to adjust the inertial sensor readings against temperature effects. The inertial sensors can be used for dead reckoning in cases which the camera tracking is insufficient, such as when the controller is obscured behind the player's back.
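Dead reckoning from the inertial sensors amounts to integrating acceleration into velocity and velocity into position, one sample at a time. A one-axis sketch (a real implementation works in 3D, subtracts gravity, and fuses in the magnetometer and camera to bound the drift this method accumulates):

```javascript
// Minimal dead reckoning: propagate position from accelerometer samples
// when camera tracking is unavailable (v += a*dt, p += v*dt per sample).
function deadReckon(position, velocity, accelSamples, dt) {
  let p = position;
  let v = velocity;
  for (const a of accelSamples) {
    v += a * dt; // integrate acceleration into velocity
    p += v * dt; // integrate velocity into position
  }
  return { position: p, velocity: v };
}

// One axis, constant 1 m/s^2 for 10 samples at 100 Hz:
const state = deadReckon(0, 0, Array(10).fill(1), 0.01);
```

Because errors compound through two integrations, even small sensor noise makes the estimate drift quickly, which is exactly why the Move corrects it with the magnetometer and camera.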
The controller face features a large oblong primary button (Move), surrounded by small action buttons, and with a regular-sized PS button beneath, arranged in a similar configuration as on the Blu-ray Disc Remote Control. On the left and right side of the controller is a Select and Start button, respectively. On the underside is an analog trigger (T). On the tail end of the controller is the wrist strap, USB port, and extension port.
The motion controller features vibration-based haptic technology. In addition to providing a tracking reference, the controller's orb light can be used to provide visual feedback, simulating aesthetic effects such as the muzzle flash of a gun or the paint on a brush.
Using different orb colors for each controller, up to four motion controllers can be tracked at once on the PlayStation 3. Demonstrations for the controller have featured activities using a single motion controller, as well as those in which the user wields two motion controllers, with one in each hand. To minimize the cost of entry, Sony stated that all launch titles for PlayStation Move would be playable with one motion controller, with enhanced options available for multiple motion controllers.
On the PlayStation 3, image processing for PlayStation Move is performed in the console's Cell microprocessor. According to Sony, use of the motion-tracking library entails some Synergistic Processing Unit (SPU) overhead as well an impact on memory, though the company states that the effects will be minimized. According to Move motion controller co-designer Anton Mikhailov, the library uses 1-2 megabytes of system memory.
The PlayStation Move navigation controller is a one-handed supplementary controller designed for use in conjunction with the PlayStation Move motion controller for certain types of gameplay, similar to the Nintendo Wii Nunchuk, although it lacks motion-sensing technology; dual-wield, independent two-handed motion control is instead implemented with a second Move controller. Replicating the major functionality of the left side of a standard PlayStation 3 gamepad, the navigation controller features a left analog stick (with L3 button function), a D-pad, an L1 button, and an L2 analog trigger. It also features Cross and Circle action buttons, as well as a PS button. Since all controls correspond to those of a standard PlayStation 3 gamepad, a Sixaxis or DualShock 3 controller can be used in place of the navigation controller in PlayStation Move applications.
Vive Controller :
The Vive has a pair of wireless controllers that make you feel like you are controlling what happens in your VR simulation. Each controller carries buttons as well as many sensors to detect gestures such as pointing and waving. Other input methods include voice controls, smart gloves, and even treadmills, which allow you to simulate walking around in a VR environment.
Data Gloves :
A dataglove is an input device that is essentially a glove worn on the hand that contains various electronic sensors that monitor the hand's movements and transform them into a form of input for applications such as virtual reality and robotics. Some datagloves enable tactile sensing, allowing the user to seemingly feel a virtual object and to apply fine-motion control.
A dataglove is used to capture physical phenomena, such as the bending of fingers, as data. It also often contains a motion tracker such as an inertial or magnetic tracking device that captures the position and rotation of the hand/glove. These movements are then interpreted by a driver or software made specifically for the glove so that the gestures can be converted into an input for a separate program such as for virtual reality, games or for controlling animatronics or other kinds of robots.
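The driver's job of turning raw sensor readings into gestures can be sketched very simply. The calibration values and the "fist" rule below are illustrative, not taken from any real glove's SDK:

```javascript
// Map a raw flex-sensor reading to a finger bend angle using per-finger
// calibration points (reading when flat vs. fully bent), clamped to range.
function bendAngleDegrees(raw, rawFlat, rawFull, maxAngle = 90) {
  const t = (raw - rawFlat) / (rawFull - rawFlat); // 0 = flat, 1 = fully bent
  return Math.min(Math.max(t, 0), 1) * maxAngle;
}

// A crude gesture rule: a fist means every finger is bent past 60 degrees.
function isFist(rawReadings, rawFlat, rawFull) {
  return rawReadings.every(
    (raw) => bendAngleDegrees(raw, rawFlat, rawFull) > 60
  );
}
```

Real glove software layers filtering and per-user calibration on top of this, but the core mapping from sensor value to joint angle to gesture is the same idea.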
Joystick :
A joystick, sometimes called a flight stick, is an input device consisting of a stick that pivots on a base and reports its angle or direction to the device it is controlling. Known in aviation as the control column, the joystick is the principal control device in the cockpit of many civilian and military aircraft, either as a center stick or a side stick, with various switches that let the Pilot and First Officer control the movements of the aircraft.
Joysticks are often used to control video games and usually have one or more push-buttons whose state can also be read by the computer. A popular variation of the joystick used on modern video game consoles is the analog stick, as on Xbox and PlayStation controllers. Joysticks are also used for controlling machines such as cranes, trucks, underwater unmanned vehicles, wheelchairs, surveillance cameras, and zero-turning-radius lawnmowers. Miniature finger-operated joysticks have been adopted as input devices for smaller electronic equipment such as mobile phones.
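Reading an analog stick typically means normalizing the raw axis values and applying a dead zone so a centered stick reports exactly zero. A sketch, assuming raw values in [-32768, 32767] as on many gamepad APIs:

```javascript
// Normalize raw analog-stick axes to [-1, 1] and apply a radial dead zone,
// rescaling so output ramps smoothly from 0 at the dead-zone edge up to 1.
function readStick(rawX, rawY, deadZone = 0.15) {
  const x = rawX / 32767;
  const y = rawY / 32767;
  const magnitude = Math.hypot(x, y);
  if (magnitude < deadZone) return { x: 0, y: 0 }; // ignore stick drift
  const scale = (magnitude - deadZone) / (1 - deadZone) / magnitude;
  return { x: x * scale, y: y * scale };
}
```

A radial (rather than per-axis) dead zone is the common choice because it treats diagonal input the same as straight input.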
That's all for this blog. I hope you learned something new. Feel free to share your thoughts and feedback, and thanks for reading!
Feel free to reach out to me:
Twitter
LinkedIn
Github