
Apple had its big event yesterday, and during the iPhone portion of the talk, it introduced Apple Visual Intelligence. With it, you can take photos of anything, and it will use Apple Intelligence for local search, shopping, homework help, and more.

Apple is using Google, ChatGPT, and I think maybe Yelp and OpenTable with Apple Maps for these integrations. And yes, this looks a lot like Google Lens…

This part of the talk starts at around the 57-minute mark, but let me share screenshots of it.

Here is a man taking a photo of a restaurant's storefront to learn more about the restaurant – I think via Yelp and OpenTable?

Apple Visual Intelligence Camera Local

Here are the results:

Apple Visual Intelligence Camera Local2

Then this one is taking a photo of a bike, to search Google for that product and its pricing:

Apple Visual Intelligence Camera Google Shopping

The results look tailored:

Apple Visual Intelligence Camera Google Shopping 2

And then getting help with homework using ChatGPT:

Apple Visual Intelligence Camera Chatgpt

Here is the video embed at that start time, if you want to watch:

So Apple is implementing AI as tools, mostly as built-in apps.

Thoughts?

Forum discussion at X.
