{"id":3138,"date":"2022-04-07T14:41:59","date_gmt":"2022-04-07T14:41:59","guid":{"rendered":"https:\/\/scienceandnerds.com\/2022\/04\/07\/google-now-lets-you-search-for-things-you-cant-describe-by-starting-with-a-picture\/"},"modified":"2022-04-07T14:42:01","modified_gmt":"2022-04-07T14:42:01","slug":"google-now-lets-you-search-for-things-you-cant-describe-by-starting-with-a-picture","status":"publish","type":"post","link":"https:\/\/scienceandnerds.com\/2022\/04\/07\/google-now-lets-you-search-for-things-you-cant-describe-by-starting-with-a-picture\/","title":{"rendered":"Google now lets you search for things you can\u2019t describe \u2014 by starting with a picture"},"content":{"rendered":"

Source: https://www.theverge.com/2022/4/7/23014141/google-lens-multisearch-android-ios

You like the way that dress looks but you'd rather have it in green. You want those shoes but prefer flats to heels. What if you could have drapes with the same pattern as your favorite notebook? I don't know how to Google for these things, but Google Search product manager Belinda Zeng showed me real-world examples of each earlier this week, and the answer was always the same: take a picture, then type a single word into Google Lens.

Today, Google is launching a US-only beta of the Google Lens Multisearch feature it teased last September at its Search On event, and while I've only seen a rough demo so far, you shouldn't have to wait long to try it for yourself: it's rolling out in the Google app on iOS and Android.

[GIF: Take a screenshot or picture of a dress, then tap, type "green," and search for a similar one in a different color. GIF: Google]

While it's mostly aimed at shopping to start — it was one of the most common requests — Google's Zeng and the company's search director Lou Wang suggest it could do a lot more than that. "You could imagine you have something broken in front of you, don't have the words to describe it, but you want to fix it… you can just type 'how to fix,'" says Wang.

In fact, it might already work with some broken bicycles, Zeng adds. She says she also learned about styling nails by screenshotting pictures of beautiful nails on Instagram, then typing the keyword "tutorial" to get the kind of video results that weren't automatically coming up on social media. You may also be able to take a picture of, say, a rosemary plant, and get instructions on how to care for it.

[GIF: Google's Belinda Zeng showed me a live demo where she found drapes to match a leafy notebook. GIF by Sean Hollister / The Verge]

"We want to help people understand questions naturally," says Wang, explaining how multisearch will expand to more videos, images in general, and even the kinds of answers you might find in a traditional Google text search.

It sounds like the intent is to put everyone on even footing, too: rather than partnering with specific shops or even limiting video results to Google-owned YouTube, Wang says it'll surface results from "any platform we're able to index from the open web."

[Screenshot: When Zeng took a picture of the wall behind her, Google came up with ties that had a similar pattern. Screenshot by Sean Hollister / The Verge]

But it won't work with everything — just as your voice assistant doesn't work with everything — because there are infinite possible requests and Google is still figuring out intent. Should the system pay more attention to the picture or to your text query if the two seem to contradict each other? Good question. For now, you do have one additional bit of control: if you'd rather match a pattern, like the leafy notebook, get up close to it so that Lens can't see it's a notebook. Because remember, Google Lens is trying to recognize your image: if it thinks you want more notebooks, you might have to tell it that you actually don't.

Google is hoping AI models can drive a new era of search, and there are big open questions about whether context — and not just text — can take it there. This experiment seems limited enough (it doesn't even use its latest MUM AI models) that it probably won't give us the answer. But it does seem like a neat trick that could go fascinating places if it became a core Google Search feature.