Project Starline is the coolest job interview you’ll ever take

I don’t have any images from my Project Starline experience. Google had a strict “no photos, no videos” policy. No colleagues, either. It was just me in a dark conference room on the grounds of the Shoreline Amphitheater in Mountain View. You walk in and sit down at a table. In front of you is what appears to be a large flat-screen TV.

A lip extends out below the display in an arc, housing a loudspeaker. There are three camera modules around the edges of the screen: one on top and one on either side. They’re a bit Kinect-like, in the way modern stereoscopic cameras tend to be.

The all-too-short seven-minute session is, in effect, an interview. A soft, blurry figure walks into frame and sits down, and the image sharpens into focus. This appears to be both a privacy feature and an opportunity for the system to calibrate to its subject. One of the main differences between this Project Starline prototype and the one Google showed off late last year is a dramatic reduction in hardware.

The team has reduced the number of cameras from “multiple” to a few and dramatically shrunk the system’s overall footprint from something akin to a restaurant booth. The trick is building a real-time 3D model of a person from far fewer camera angles. That’s where AI and ML step in to fill in the gaps in the data, not entirely unlike the way the Pixel handles backgrounds with tools like Magic Eraser, albeit with a three-dimensional rendering.
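
To make that a little more concrete, here is a minimal, hypothetical sketch (my own toy model, not Google’s published pipeline) of the first half of that idea: back-projecting depth images from a handful of RGB-D camera pods into a single point cloud, while flagging the uncovered regions that a learned model would then have to fill in.

```python
# Toy illustration only: fuse depth maps from a few camera pods into one
# point cloud and flag the gaps that ML "in-painting" would have to cover.
import numpy as np

def depth_to_points(depth, fx, fy, cx, cy, cam_to_world):
    """Back-project a depth image (meters) into world-space 3D points."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    pts = np.stack([x, y, z, np.ones_like(z)], axis=-1).reshape(-1, 4)
    valid = pts[:, 2] > 0            # depth == 0 marks pixels the sensor missed
    return (pts[valid] @ cam_to_world.T)[:, :3], valid.reshape(h, w)

def fuse_views(views):
    """Merge per-camera point clouds; return the cloud plus per-view hole masks."""
    clouds, holes = [], []
    for depth, intrinsics, pose in views:   # intrinsics = (fx, fy, cx, cy)
        pts, valid = depth_to_points(depth, *intrinsics, pose)
        clouds.append(pts)
        holes.append(~valid)                # the gaps a learned model would fill
    return np.concatenate(clouds), holes
```

Per the research paper quoted below, the real system couples this kind of geometry fusion with compressed color and depth video streams so the result can be rendered live over a network.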

Once my interviewee – a member of the Project Starline team – shows up, it takes the eyes and brain a moment to adjust. It’s a convincing hologram, especially for one rendered in real time, with about the same kind of lag you’d experience on a plain old two-dimensional Zoom call.

You’ll notice something that’s a little… off. People are often the hardest thing to get right; we’ve evolved over millennia to spot the slightest deviation from the norm. I throw out the term “twitching” to describe the subtle movement on parts of the subject’s skin. He – more accurately – calls them “artifacts”: small spots where the system doesn’t get things quite right, likely due to limits on the data the onboard sensors can collect. That includes areas lacking visual information, which look almost as if the painter ran short on paint.

Much of your comfort level comes down to adjusting to this new presentation of digital information. When most of us talk to another person, we’re generally not fixated on their bodily form throughout the conversation. You focus on the words and, if you’re attuned to such things, on the subtle physical cues we drop along the way. Presumably, the more you use the system, the less calibration your brain needs.

Here’s how a Google research paper describes the technology:

Our system realizes important 3D audiovisual cues (stereopsis, motion parallax and spatial audio) and enables the full range of communication cues (eye contact, hand gestures and body language), but does not require special glasses or body-worn microphones/headphones. The system consists of a head-tracked autostereoscopic display, high-resolution 3D capture and rendering subsystems, and network transmission using compressed color and depth video streams. Other contributions include a new image-based geometry fusion algorithm, free-space dereverberation, and speaker localization.

Effectively, Project Starline collects information and presents it in a way that creates the perception of depth (stereopsis), using the two spaced-apart biological cameras in our skulls. Spatial audio does something similar for sound, calibrating the output so the other person’s voice seems to come from their virtual mouth.
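
As a rough illustration of the interaural cues behind that effect (again, my own toy model, not the paper’s speaker-localization or dereverberation methods), you can pan and delay a mono voice signal based on where the rendered mouth sits relative to the listener’s ears:

```python
# Toy spatializer: level and time differences between the two ears are the
# basic cues the brain uses to place a voice in space.
import numpy as np

def spatialize(mono, mouth_pos, listener_pos, ear_offset=0.09, sample_rate=48000):
    """Return a (left, right) pair of signals from a mono voice array.

    mouth_pos and listener_pos are 3D positions in meters; ear_offset is the
    assumed half-distance between the ears.
    """
    left_ear = listener_pos + np.array([-ear_offset, 0.0, 0.0])
    right_ear = listener_pos + np.array([ear_offset, 0.0, 0.0])
    d_l = np.linalg.norm(mouth_pos - left_ear)
    d_r = np.linalg.norm(mouth_pos - right_ear)
    # Level difference: simple inverse-distance gain per ear.
    left = mono / max(d_l, 1e-3)
    right = mono / max(d_r, 1e-3)
    # Time difference: delay the farther ear by the extra travel time (343 m/s).
    delay = int(abs(d_l - d_r) / 343.0 * sample_rate)
    if d_l > d_r:
        left = np.concatenate([np.zeros(delay), left[:len(left) - delay]])
    else:
        right = np.concatenate([np.zeros(delay), right[:len(right) - delay]])
    return left, right
```

The paper’s spatial-audio work layers speaker localization and free-space dereverberation on top of this basic idea so the voice tracks the rendered head as it moves.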

Google has been testing this particular prototype for a while with WeWork, T-Mobile, and Salesforce — presumably the kind of large enterprise customers that would be interested in something like this. The company says a lot of the feedback revolves around how lifelike the experience is compared to the likes of Google Meet, Zoom, and Webex — platforms that saved our collective butts during the pandemic but still have plenty of limitations.

You’ve probably heard people complain — or complained yourself — about the things we lost in the move from face-to-face meetings to virtual ones. The feeling is well founded. Obviously, Project Starline is still a virtual experience, but it can probably trick your brain into believing otherwise. For a work meeting, frankly, that’s probably more than enough.

There’s no timeline here and no pricing. Google called it a “technology project” at our meeting. Presumably the ideal outcome for all the time and money spent on such a project is a salable product, though its eventual size and likely price will almost certainly put it out of reach for most of us. I could see a more modular version of the camera system, one that clips onto the side of a TV or computer, working well.

For most people and most situations, it’s overkill in its current form, but it’s easy to see how Google could be pointing toward the future of teleconferencing. It sure beats your boss calling you into an unfinished metaverse.
