BREAKING NEWS

Jio to launch AR glasses

Jio AR Glass: Everything to know about Jio's mixed reality glasses

Mukesh Ambani-owned Reliance Jio on Wednesday launched its first smart glasses, called Jio Glass. Jio Glass is based on mixed reality and works over both cellular and Wi-Fi networks through the paired phone, much like the recently launched Snap Spectacles 3. Jio Glass is the marquee product developed by the Jio Platforms team; Jio Platforms has freshly raised over Rs 33,000 crore from Google, following huge investments from over 13 other firms.

The market for smart devices is still a niche in India, which is why there is huge scope for companies to launch products that not only woo customers but also pave the way for next-generation technologies. Reliance Jio rose to the occasion and launched its own smart glasses in India. The company is calling them Jio Glass and pitching them as an advanced solution for video conferencing, among other uses. If Jio Glass reshapes the way people conduct video calls, it will be pivotal to Jio's expanding business.


Here are five key things to know about the Jio Glass:

  • Jio Glass looks very much like Google Glass, only bulkier and chunkier; a closer analogy is the recently launched Spectacles 3 from Snap, the parent company of Snapchat. The glasses look robust and trendy, but the practicality of the design will only become clear once they are available for use. Jio Glass has a thick temple covered with plastic, with several buttons on it for different functions.
  • There is no information about the processor used in Jio Glass, but the company has said the smart glasses will be able to run as many as 25 apps while supporting spatial and directional XR audio through their speakers. There are also two microphones on the glasses to assist users during video conferencing. A camera sits in the centre of the frame, right above the bridge; it can click photos and upload them in real time to any video conference, and the images can also be saved to the paired smartphone.
  • Jio Glass will use mixed reality for video conferences and remote communication. It is driven by augmented reality and virtual reality for features such as a 3D holographic image of each participant, in addition to the 2D avatar from the regular video feed. Each avatar appears right in front of the user's eyes in a virtually created environment, such as an office or a conference room. Participants will be able to share files and make presentations (in both 2D and 3D formats) while being virtually present in Jio Glass's environment.
  • Jio Glass will also be used for virtual classrooms to teach students online. With the pandemic forcing schools to remain shut country-wide, education boards and ed-tech companies are building solutions to make teaching easier. Jio Glass will offer virtual demonstrations, such as virtual tours of geographical locations, and online marking of tests in a classroom that can hold as many students as the host wants. Jio is launching its own online education platform, called Embibe, which will be integrated with Jio Glass.
  • Jio Glass currently supports 25 apps meant for video conferencing and online collaboration. It also supports voice commands for most functions, minimising the need for buttons. For example, to make a video call to one or more people, the user simply says "Hello Jio, call [person 1] and [person 2]", and so on; Jio Glass will then place the call, provided it is tethered to a phone over a cable or Wi-Fi. A minimal voice-command parsing sketch follows this list.
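
Jio has not published a developer or voice-command API for Jio Glass, so the snippet below is only a hypothetical sketch of how an assistant might parse a spoken command such as "Hello Jio, call Asha and Ravi" into a list of people to dial. The wake phrase, grammar, and names are assumptions made for illustration.

```python
import re

WAKE_WORD = "hello jio"  # assumed wake phrase, based on the launch demo

def parse_call_command(utterance: str) -> list[str]:
    """Extract callee names from a command like 'Hello Jio, call Asha and Ravi'.

    Purely illustrative: Jio has not published a voice-command grammar,
    so this pattern is an assumption, not the device's actual behaviour.
    """
    text = utterance.strip().lower()
    if not text.startswith(WAKE_WORD):
        return []
    # Drop the wake word and any punctuation right after it.
    text = text[len(WAKE_WORD):].lstrip(" ,")
    match = re.match(r"call\s+(.+)", text)
    if not match:
        return []
    # Split the remainder on 'and' / commas to get individual names.
    names = re.split(r"\s*(?:,|\band\b)\s*", match.group(1))
    return [n.title() for n in names if n]

if __name__ == "__main__":
    print(parse_call_command("Hello Jio, call Asha and Ravi"))
    # ['Asha', 'Ravi']
```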

Reliance Jio has not said much about Jio Glass beyond this, including its full specifications and other features. The price and availability of Jio Glass have also not been divulged.




Augmented reality (AR) is an interactive experience of a real-world environment where the objects that reside in the real world are enhanced by computer-generated perceptual information, sometimes across multiple sensory modalities, including visual, auditory, haptic, somatosensory and olfactory.[1][2] AR can be defined as a system that fulfills three basic features: a combination of real and virtual worlds, real-time interaction, and accurate 3D registration of virtual and real objects.[3] The overlaid sensory information can be constructive (i.e. additive to the natural environment), or destructive (i.e. masking of the natural environment).[4] This experience is seamlessly interwoven with the physical world such that it is perceived as an immersive aspect of the real environment.[4] In this way, augmented reality alters one's ongoing perception of a real-world environment, whereas virtual reality completely replaces the user's real-world environment with a simulated one.[5][6] Augmented reality is related to two largely synonymous terms: mixed reality and computer-mediated reality.
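
The "accurate 3D registration" requirement essentially means that a virtual object anchored at a known world position must be re-projected into the camera image using the tracked camera pose and intrinsics, so that it stays locked to the real scene. The sketch below shows that projection step with NumPy; the pose, intrinsic matrix, and anchor point are placeholder values, not data from any particular AR system.

```python
import numpy as np

def project_point(point_world, R, t, K):
    """Project a 3D world point into pixel coordinates.

    point_world: (3,) position of the virtual object in world coordinates
    R, t:        camera pose (rotation matrix and translation) from tracking
    K:           3x3 camera intrinsic matrix
    Returns (u, v) pixel coordinates where the virtual object should be drawn.
    """
    p_cam = R @ point_world + t          # world -> camera coordinates
    u, v, w = K @ p_cam                  # pinhole projection
    return u / w, v / w

# Toy values standing in for real tracking output (assumptions, not real data).
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
R = np.eye(3)                            # camera aligned with world axes
t = np.array([0.0, 0.0, 0.0])
anchor = np.array([0.1, 0.0, 2.0])       # virtual object 2 m in front of camera

print(project_point(anchor, R, t, K))    # pixel where the overlay is rendered
```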

The primary value of augmented reality is the manner in which components of the digital world blend into a person's perception of the real world, not as a simple display of data, but through the integration of immersive sensations, which are perceived as natural parts of an environment. The earliest functional AR systems that provided immersive mixed reality experiences for users were invented in the early 1990s, starting with the Virtual Fixtures system developed at the U.S. Air Force's Armstrong Laboratory in 1992.[4][7][8] Commercial augmented reality experiences were first introduced in entertainment and gaming businesses. Subsequently, augmented reality applications have spanned commercial industries such as education, communications, medicine, and entertainment. In education, content may be accessed by scanning or viewing an image with a mobile device or by using markerless AR techniques.[9][10]

Augmented reality is used to enhance natural environments or situations and offer perceptually enriched experiences. With the help of advanced AR technologies (e.g. adding computer vision, incorporating AR cameras into smartphone applications, and object recognition) the information about the surrounding real world of the user becomes interactive and digitally manipulable. Information about the environment and its objects is overlaid on the real world. This information can be virtual[11][12][13][14] or real, e.g. seeing other real sensed or measured information such as electromagnetic radio waves overlaid in exact alignment with where they actually are in space.[15][16][17] Augmented reality also has a lot of potential in the gathering and sharing of tacit knowledge. Augmentation techniques are typically performed in real time and in semantic contexts with environmental elements. Immersive perceptual information is sometimes combined with supplemental information like scores over a live video feed of a sporting event. This combines the benefits of both augmented reality and heads-up display (HUD) technology.
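
As a concrete illustration of combining supplemental information with a live feed, the sketch below draws a score banner onto a video frame with OpenCV. It is a minimal, assumption-laden example: the frame is a synthetic placeholder and the score text is invented, but the compositing uses standard OpenCV drawing functions.

```python
import cv2
import numpy as np

def overlay_score(frame: np.ndarray, text: str) -> np.ndarray:
    """Draw a simple semi-transparent score banner onto a video frame."""
    out = frame.copy()
    # Remember the original pixels, fill the banner area with black,
    # then blend so the banner is only partially opaque.
    banner = out[10:60, 10:260].copy()
    cv2.rectangle(out, (10, 10), (260, 60), (0, 0, 0), thickness=-1)
    out[10:60, 10:260] = cv2.addWeighted(banner, 0.4, out[10:60, 10:260], 0.6, 0)
    cv2.putText(out, text, (20, 45), cv2.FONT_HERSHEY_SIMPLEX,
                0.9, (255, 255, 255), 2, cv2.LINE_AA)
    return out

# Placeholder frame; in a real AR pipeline this would come from the camera.
frame = np.zeros((480, 640, 3), dtype=np.uint8)
annotated = overlay_score(frame, "IND 245/3")   # invented score text
cv2.imwrite("annotated_frame.png", annotated)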

The difference between virtual reality and augmented reality

In virtual reality (VR), the users' perception of reality is completely based on virtual information. In augmented reality (AR) the user is provided with additional computer generated information that enhances their perception of reality.[18][19] For example, in architecture, VR can be used to create a walk-through simulation of the inside of a new building; and AR can be used to show a building's structures and systems super-imposed on a real-life view. Another example is through the use of utility applications. Some AR applications, such as Augment, enable users to apply digital objects into real environments, allowing businesses to use augmented reality devices as a way to preview their products in the real world.[20] Similarly, it can also be used to demo what products may look like in an environment for customers, as demonstrated by companies such as Mountain Equipment Co-op or Lowe's who use augmented reality to allow customers to preview what their products might look like at home through the use of 3D models.[21]

Augmented reality (AR) differs from virtual reality (VR) in that in AR part of the surrounding environment is actually 'real', and the system merely adds layers of virtual objects to that real environment. In VR, on the other hand, the surrounding environment is completely virtual. A demonstration of how AR layers objects onto the real world can be seen with augmented reality games. WallaMe is an augmented reality game application that allows users to hide messages in real environments, utilizing geolocation technology to let users hide messages wherever they wish in the world.[22] Such applications have many uses in the world, including in activism and artistic expression.[23]
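
The geolocation mechanic behind an app like WallaMe can be reduced to storing each hidden message with its latitude and longitude and returning only those within a small radius of the viewer. The sketch below is a generic, assumption-based illustration using the haversine formula; it is not WallaMe's actual implementation or API, and the coordinates and radius are invented.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS84 coordinates."""
    r = 6371000.0  # mean Earth radius in metres
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlon = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlon / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical hidden messages: (latitude, longitude, text).
messages = [
    (48.8584, 2.2945, "Hidden note at the Eiffel Tower"),
    (51.5007, -0.1246, "Hidden note at Big Ben"),
]

def messages_near(lat, lon, radius_m=50.0):
    """Return messages hidden within radius_m metres of the viewer."""
    return [text for mlat, mlon, text in messages
            if haversine_m(lat, lon, mlat, mlon) <= radius_m]

print(messages_near(48.8585, 2.2946))  # viewer standing near the Eiffel Tower
```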

Virtual retinal display

A virtual retinal display (VRD) is a personal display device under development at the University of Washington's Human Interface Technology Laboratory under Dr. Thomas A. Furness III.[54] With this technology, a display is scanned directly onto the retina of a viewer's eye. This results in bright images with high resolution and high contrast. The viewer sees what appears to be a conventional display floating in space.[55]

Several tests were done to analyze the safety of the VRD.[54] In one test, patients with partial loss of vision—having either macular degeneration (a disease that degenerates the retina) or keratoconus—were selected to view images using the technology. In the macular degeneration group, five out of eight subjects preferred the VRD images to the cathode-ray tube (CRT) or paper images, finding them better and brighter, and were able to see equal or better resolution levels. The keratoconus patients could all resolve smaller lines in several line tests using the VRD than with their own correction. They also found the VRD images easier to view and sharper. As a result of these tests, the virtual retinal display is considered a safe technology.

The virtual retinal display creates images that can be seen in ambient daylight and ambient room light. The VRD is considered a preferred candidate for use in surgical displays due to its combination of high resolution, high contrast, and brightness. Additional tests show high potential for the VRD to be used as a display technology for patients with low vision.


Commerce

Illustration of an AR-Icon. The AR-Icon can be used as a marker on print as well as on online media. It signals to the viewer that digital content is behind it; the content can be viewed with a smartphone or tablet.

AR is used to integrate print and video marketing. Printed marketing material can be designed with certain "trigger" images that, when scanned by an AR-enabled device using image recognition, activate a video version of the promotional material. A major difference between augmented reality and straightforward image recognition is that one can overlay multiple media at the same time in the view screen, such as social media share buttons, in-page video, and even audio and 3D objects. Traditional print-only publications are using augmented reality to connect different types of media.[124][125][126][127][128]
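
A minimal way to implement such a "trigger" check is ordinary feature matching: extract keypoints from the scanned page and from the known trigger artwork, and treat a high number of good matches as recognition. The sketch below uses ORB features in OpenCV; the file names and the match thresholds are arbitrary assumptions, and a production system would typically add a geometric verification step (for example a homography check).

```python
import cv2

def matches_trigger(camera_img_path: str, trigger_img_path: str,
                    min_good_matches: int = 40) -> bool:
    """Return True if the camera image appears to contain the trigger image.

    Uses ORB keypoints and brute-force Hamming matching; the thresholds are
    arbitrary assumptions chosen for illustration.
    """
    cam = cv2.imread(camera_img_path, cv2.IMREAD_GRAYSCALE)
    trig = cv2.imread(trigger_img_path, cv2.IMREAD_GRAYSCALE)
    if cam is None or trig is None:
        raise FileNotFoundError("could not read one of the input images")
    orb = cv2.ORB_create(nfeatures=1000)
    _, des_cam = orb.detectAndCompute(cam, None)
    _, des_trig = orb.detectAndCompute(trig, None)
    if des_cam is None or des_trig is None:
        return False
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des_trig, des_cam)
    good = [m for m in matches if m.distance < 40]  # keep close descriptor pairs
    return len(good) >= min_good_matches

# Hypothetical file names for a scanned ad page and its known trigger artwork.
if matches_trigger("scanned_page.jpg", "print_ad_trigger.jpg"):
    print("Trigger recognised: play the linked promotional video.")
```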

AR can enhance product previews such as allowing a customer to view what's inside a product's packaging without opening it.[129] AR can also be used as an aid in selecting products from a catalog or through a kiosk. Scanned images of products can activate views of additional content such as customization options and additional images of the product in its use.[130]

By 2010, virtual dressing rooms had been developed for e-commerce.[131]

In 2012, a mint used AR techniques to market a commemorative coin for Aruba. The coin itself was used as an AR trigger, and when held in front of an AR-enabled device it revealed additional objects and layers of information that were not visible without the device.[132][133]

In 2018, Apple announced USDZ AR file support for iPhones and iPads with iOS 12. Apple has created an AR Quick Look Gallery that allows anyone to experience augmented reality on their own Apple device.[134]

In 2018, Shopify, the Canadian e-commerce company, announced ARKit 2 integration. Its merchants can use the tools to upload 3D models of their products, which users can then tap on inside Safari to view in their real-world environments.[135]

In 2018, Twinkl released a free AR classroom application. Pupils can see how York looked over 1,900 years ago.[136] Twinkl launched the first ever multi-player AR game, Little Red[137] and has over 100 free AR educational models.[138]

Augmented reality is becoming more frequently used for online advertising. Retailers offer the ability to upload a picture on their website and "try on" various clothes which are overlaid on the picture. Even further, companies such as Bodymetrics install dressing booths in department stores that offer full-body scanning. These booths render a 3-D model of the user, allowing the consumers to view different outfits on themselves without the need of physically changing clothes.[139] For example, JC Penney and Bloomingdale's use "virtual dressing rooms" that allow customers to see themselves in clothes without trying them on.[140] Another store that uses AR to market clothing to its customers is Neiman Marcus.[141] Neiman Marcus offers consumers the ability to see their outfits in a 360-degree view with their "memory mirror".[141] Makeup stores like L'Oreal, Sephora, Charlotte Tilbury, and Rimmel also have apps that utilize AR.[142] These apps allow consumers to see how the makeup will look on them.[142] According to Greg Jones, director of AR and VR at Google, augmented reality is going to "reconnect physical and digital retail".[142]

AR technology is also used by furniture retailers such as IKEA, Houzz, and Wayfair.[142][140] These retailers offer apps that allow consumers to view their products in their home prior to purchasing anything.[142] In 2017, IKEA announced the IKEA Place app. It contains a catalogue of over 2,000 products, nearly the company's full collection of sofas, armchairs, coffee tables, and storage units, which one can place anywhere in a room with their phone.[143] The app made it possible to have 3D and true-to-scale models of furniture in the customer's living space. IKEA realized that its customers are not shopping in stores as often or making direct purchases anymore.[144][145]


The dangers of AR

Reality modifications

In a paper titled "Death by Pokémon GO", researchers at Purdue University's Krannert School of Management claim the game caused "a disproportionate increase in vehicular crashes and associated vehicular damage, personal injuries, and fatalities in the vicinity of locations, called PokéStops, where users can play the game while driving."[252] Using data from one municipality, the paper extrapolates what that might mean nationwide and concludes that "the increase in crashes attributable to the introduction of Pokémon GO is 145,632 with an associated increase in the number of injuries of 29,370 and an associated increase in the number of fatalities of 256 over the period of July 6, 2016, through November 30, 2016." The authors estimated the cost of those crashes and fatalities at between $2 billion and $7.3 billion for the same period. Furthermore, more than one in three surveyed advanced Internet users would like to edit out disturbing elements around them, such as garbage or graffiti.[253] They would even like to modify their surroundings by erasing street signs, billboard ads, and uninteresting shopping windows. AR thus appears to be as much a threat to companies as it is an opportunity: although this could be a nightmare for the numerous brands that do not manage to capture consumer imaginations, it also creates the risk that wearers of augmented reality glasses may become unaware of surrounding dangers. Consumers want to use augmented reality glasses to change their surroundings into something that reflects their own personal opinions; around two in five want to change the way their surroundings look and even how people appear to them.

Alongside the possible privacy issues described below, overload and over-reliance issues are the biggest dangers of AR. For the development of new AR-related products, this implies that the user interface should follow certain guidelines so as not to overload the user with information, while also preventing the user from over-relying on the AR system such that important cues from the environment are missed.[254] This is called the virtually-augmented key.[254] Once the key is ignored, people might not desire the real world anymore.

Privacy concerns

The concept of modern augmented reality depends on the ability of the device to record and analyze the environment in real time. Because of this, there are potential legal concerns over privacy. While the First Amendment to the United States Constitution allows for such recording in the name of public interest, the constant recording of an AR device makes it difficult to do so without also recording outside of the public domain. Legal complications would be found in areas where a right to a certain amount of privacy is expected or where copyrighted media are displayed.

In terms of individual privacy, there is the ease of access to information that one should not readily possess about a given person. This is accomplished through facial recognition technology. Assuming that AR automatically passes information about persons that the user sees, anything from social media profiles to criminal records and marital status could be shown.[255]

The Code of Ethics on Human Augmentation, which was originally introduced by Steve Mann in 2004 and further refined with Ray Kurzweil and Marvin Minsky in 2013, was ultimately ratified at the Virtual Reality Toronto conference on June 25, 2017.


History

  • 1901: L. Frank Baum, an author, first mentions the idea of an electronic display/spectacles that overlays data onto real life (in this case 'people'). It is named a 'character marker'.[275]
  • 1957–62: Morton Heilig, a cinematographer, creates and patents a simulator called Sensorama with visuals, sound, vibration, and smell.[276]
  • 1968: Ivan Sutherland invents the head-mounted display and positions it as a window into a virtual world.[277]
  • 1975: Myron Krueger creates Videoplace to allow users to interact with virtual objects.
  • 1980: The research by Gavan Lintern of the University of Illinois is the first published work to show the value of a heads up display for teaching real-world flight skills.[193]
  • 1980: Steve Mann creates the first wearable computer, a computer vision system with text and graphical overlays on a photographically mediated scene.[278] See EyeTap. See Heads Up Display.
  • 1981: Dan Reitan geospatially maps multiple weather radar images and space-based and studio cameras to earth maps and abstract symbols for television weather broadcasts, bringing a precursor concept to augmented reality (mixed real/graphical images) to TV.[279]
  • 1986: Within IBM, Ron Feigenblatt describes the most widely experienced form of AR today (viz. "magic window," e.g. smartphone-based Pokémon Go), use of a small, "smart" flat panel display positioned and oriented by hand.[280]
  • 1987: Douglas George and Robert Morris create a working prototype of an astronomical telescope-based "heads-up display" system (a precursor concept to augmented reality) which superimposed in the telescope eyepiece, over the actual sky images, multi-intensity star, and celestial body images, and other relevant information.[281]
  • 1990: The term 'Augmented Reality' is attributed to Thomas P. Caudell, a former Boeing researcher.[282]
  • 1992: Louis Rosenberg developed one of the first functioning AR systems, called Virtual Fixtures, at the United States Air Force Research Laboratory—Armstrong, that demonstrated benefit to human perception.[283]
  • 1992: Steven Feiner, Blair MacIntyre and Doree Seligmann present an early paper on an AR system prototype, KARMA, at the Graphics Interface conference.
  • 1993: CMOS active-pixel sensor, a type of metal–oxide–semiconductor (MOS) image sensor, developed at NASA's Jet Propulsion Laboratory.[284] CMOS sensors are later widely used for optical tracking in AR technology.[285]
  • 1993: Mike Abernathy, et al., report the first use of augmented reality in identifying space debris using Rockwell WorldView by overlaying satellite geographic trajectories on live telescope video.[195]
  • 1993: A widely cited version of the paper above is published in Communications of the ACM – Special issue on computer augmented environments, edited by Pierre Wellner, Wendy Mackay, and Rich Gold.[286]
  • 1993: Loral WDL, with sponsorship from STRICOM, performed the first demonstration combining live AR-equipped vehicles and manned simulators. Unpublished paper, J. Barrilleaux, "Experiences and Observations in Applying Augmented Reality to Live Training", 1999.[287]
  • 1994: Julie Martin creates the first 'Augmented Reality Theater production', Dancing in Cyberspace, funded by the Australia Council for the Arts, featuring dancers and acrobats manipulating body-sized virtual objects in real time, projected into the same physical space and performance plane. The acrobats appeared immersed within the virtual objects and environments. The installation used Silicon Graphics computers and a Polhemus sensing system.
  • 1995: S. Ravela et al. at University of Massachusetts introduce a vision-based system using monocular cameras to track objects (engine blocks) across views for augmented reality.
  • 1998: Spatial Augmented Reality introduced at University of North Carolina at Chapel Hill by Ramesh Raskar, Welch, Henry Fuchs.[62]
  • 1999: Frank Delgado, Mike Abernathy et al. report successful flight test of LandForm software video map overlay from a helicopter at Army Yuma Proving Ground overlaying video with runways, taxiways, roads and road names.[200][201]
  • 1999: The US Naval Research Laboratory engages in a decade-long research program called the Battlefield Augmented Reality System (BARS) to prototype some of the early wearable systems for dismounted soldiers operating in urban environments, for situation awareness and training.[288]
  • 1999: NASA X-38 flown using LandForm software video map overlays at Dryden Flight Research Center.[289]
  • 2000: Rockwell International Science Center demonstrates tetherless wearable augmented reality systems receiving analog video and 3-D Audio over radio-frequency wireless channels. The systems incorporate outdoor navigation capabilities, with digital horizon silhouettes from a terrain database overlain in real time on the live outdoor scene, allowing visualization of terrain made invisible by clouds and fog.[290][291]
  • 2004: Outdoor helmet-mounted AR system demonstrated by Trimble Navigation and the Human Interface Technology Laboratory (HIT lab).[102]
  • 2008: Wikitude AR Travel Guide launches on 20 Oct 2008 with the G1 Android phone.[292]
  • 2009: ARToolkit was ported to Adobe Flash (FLARToolkit) by Saqoosha, bringing augmented reality to the web browser.[293]
  • 2010: Design of mine detection robot for Korean mine field.[196]
  • 2012: Launch of Lyteshot, an interactive AR gaming platform that utilizes smart glasses for game data
  • 2013: Meta announces the Meta 1 developer kit.[294][295]
  • 2015: Microsoft announces Windows Holographic and the HoloLens augmented reality headset. The headset utilizes various sensors and a processing unit to blend high definition "holograms" with the real world.[296]
  • 2016: Niantic released Pokémon Go for iOS and Android in July 2016. The game quickly became one of the most popular smartphone applications and in turn spikes the popularity of augmented reality games.[297]
  • 2017: Magic Leap announces the use of Digital Lightfield technology embedded into the Magic Leap One headset. The Creator Edition headset includes the glasses and a computing pack worn on the belt.[298]
  • 2019: Microsoft announces HoloLens 2 with significant improvements in terms of field of view and ergonomics.


