Source: https://techcrunch.com/2023/05/11/project-starline-is-the-coolest-work-call-youll-ever-take/

I don’t have any images from my Project Starline experience. Google had a strict “no photos, no videos” policy in place. No colleagues, either. Just me in a dark meeting room on the Shoreline Amphitheater grounds in Mountain View. You walk in and sit down in front of a table. In front of you is what looks like a big, flat-screen TV.

A lip below the screen extends out in an arc, encased in a speaker. There are three camera modules on the screen’s edges — on the top and flanking both sides. They look a bit like Kinects, in that way all modern stereoscopic cameras seem to.

The all-too-brief seven-minute session is effectively an interview. A soft, blurry figure walks into frame and sits down as the image’s focus sharpens. It appears to be both a privacy setting and a chance for the system to calibrate its subject. One of the key differences between this Project Starline prototype and the one Google showed off late last year is a dramatic reduction in hardware.

The team has cut the number of cameras from “several” to a few and dramatically shrunk the overall system down from something resembling one of those diner booths. The trick is building a real-time 3D model of a person from far fewer camera angles. That’s where AI and ML step in, filling in the gaps in the data — not entirely dissimilar to the way the Pixel approximates backgrounds with tools like Magic Eraser, albeit with a three-dimensional render.

After my interview subject — a member of the Project Starline team — appears, it takes a bit of time for the eyes and brain to adjust. It’s a convincing hologram, especially for one being rendered in real time, with roughly the same sort of lag you would experience on a plain old two-dimensional Zoom call.

You’ll notice something a bit…off. Humans tend to be the most difficult subjects to render convincingly. We’ve evolved over millennia to identify the slightest deviation from the norm. I throw out the term “twitching” to describe the subtle movement on parts of the subject’s skin. He — more accurately — calls them “artifacts.” These are little instances the system didn’t quite nail, likely due to limitations in the data being collected by the onboard sensors. That includes portions with an absence of visual information, which appear as though the artist ran short on paint.

A lot of your comfort level comes down to adjusting to this new presentation of digital information. Generally speaking, when most of us talk to another person, we don’t spend the entire conversation fixated on their corporeal form. You focus on the words and, if you’re attuned to such things, the subtle physical cues we drop along the way. Presumably, the more you use the system, the less calibration your brain requires.

Quoting from a Google research publication on the technology:

“Our system achieves key 3D audiovisual cues (stereopsis, motion parallax, and spatialized audio) and enables the full range of communication cues (eye contact, hand gestures, and body language), yet does not require special glasses or body-worn microphones/headphones. The system consists of a head-tracked autostereoscopic display, high-resolution 3D capture and rendering subsystems, and network transmission using compressed color and depth video streams. Other contributions include a novel image-based geometry fusion algorithm, free-space dereverberation, and talker localization.”

Effectively, Project Starline gathers information and presents it in a way that creates the perception of depth (stereopsis), using the two spaced-out biological cameras in our skulls. Spatial audio, meanwhile, serves a similar function for sound, calibrating the speakers to give the impression that the speaker’s voice is coming out of their virtual mouth.

Google has been testing this specific prototype version for some time now with WeWork, T-Mobile and Salesforce — presumably the sorts of big corporate clients that would be interested in such a thing. The company says much of the feedback revolves around how true to life the experience is versus things like Google Meet, Zoom and Webex — platforms that saved our collective butts during the pandemic, but still have a good number of limitations.

You’ve likely heard people complain — or complained yourself — about the things we lost as we moved from in-person meetings to virtual ones. It’s a fair complaint. Obviously Project Starline is still very much a virtual experience, but it can probably trick your brain into believing otherwise. For the sake of a workplace meeting, that’s frankly probably more than enough.

There’s no timeline here and no pricing. Google referred to it as a “technology project” during our meeting. Presumably the ideal outcome for all of the time and money spent on such a project is a saleable product. The eventual size and likely pricing will almost certainly be out of reach for most of us. I could see a more modular version of the camera system — one that clips onto the side of a TV or computer — doing well.

For most people in most situations, it’s overkill in its current form, but it’s easy to see how Google could well be pointing to the future of teleconferencing. It certainly beats your bosses making you take calls in an unfinished metaverse.
Project Starline is the coolest work call you’ll ever take
2023-05-11 21:46:10