When Apple announced the iPhone 16E yesterday, it also confirmed that the new budget phone will get Apple Intelligence's "Visual Intelligence" feature, marking the first time the AI trick will come to a phone without a "Camera Control" button. While the other iPhone 16 series phones use their Camera Control buttons to access Visual Intelligence, the iPhone 16E can instead map it to its Action Button, a simple change that raises the question: why not the iPhone 15 Pro, too?
Personally, as an iPhone 15 Pro owner, I've been asking that question for months now, as I've long suspected my phone's internals were capable of it: it can run every other Apple Intelligence feature without issue. It instead seemed to me like Apple was arbitrarily holding the feature back because it wanted to tie it to a specific button press I didn't have. Well, with the iPhone 16E adopting the Action Button workaround, it seems like Apple's finally listening. Apple representatives have now confirmed that Visual Intelligence will be coming to the iPhone 15 Pro as well, using the same strategy.
Speaking to Daring Fireball's John Gruber, an Apple spokesperson said that the iPhone 15 Pro will indeed get Visual Intelligence "in a future software update," and that users will be able to map it to the Action Button. Sweet vindication.
There's no word on when exactly that software update will come, and to be honest, I'm not sure I'll use Visual Intelligence much, but it's encouraging to see my phone's software no longer held back by an arbitrary push for hardware cohesion.
For the uninitiated, Visual Intelligence brings AI to your iPhone's camera. You can point your camera at a foreign-language menu, for instance, to get a translation, or point it at a book to get a summary of what's on the page, or point it at a dog to try to find out what breed it is. It can also surface information about businesses simply by looking at their storefront or signage (in the United States only), and it works with Google and ChatGPT for extended search queries. In other words, it's similar to Google Lens, but it puts AI first and is built into your operating system. Again, I've been prevented from playing around with it much, but hey, at least I now have the option.
Full story here: