That Apple will present an artificial intelligence model in June is something we all expected, given the progress of the competition. It is also all but confirmed by the hints its CEO, Tim Cook, has been dropping.
The question now is what this artificial intelligence will look like. Personally, I know exactly what kind of AI I would like to see in my Apple products like the iPhone or the Mac. It is a model that I do not consider unreasonable to expect in June, because Apple has already previewed it with MGIE.
“I give you an iPhone to take good photos and this AI to improve them”
This phrase could very well be the artificial intelligence slogan I expect from Apple at the next developers conference. The whole idea rests on a model similar to the aforementioned MGIE, which Apple presented a few weeks ago in collaboration with the University of California, Santa Barbara.
MGIE brings advanced photo-editing features. It is not an image generator like Midjourney or DALL-E 3; instead, it edits photos you have already taken, following a few simple instructions. So I imagine Apple making the most of its iPhone cameras, as it always does, and announcing an add-on that makes them even better.
Give a simple instruction and let the AI work out how to transform our photographs
MGIE is similar to what Photoshop’s generative AI offers. In Adobe’s tool, we can take a simple photograph of a plate of food and ask it to add a loaf of bread next to it, or even a scoop of ice cream. There are even more advanced functions, such as inventing the surroundings: take a vertical photograph of a beach and ask for it to be extended horizontally, producing a scene that is completely realistic and consistent with the original crop.
Well, MGIE already does this to some extent, but in a different way: there is no need to write a very specific prompt. A “prompt” is what we call the instructions we give a generative AI, and in some models it has to be very precise. MGIE shows clear examples of its simplicity, such as taking a picture of a pizza and asking it to make it healthier. The AI understands vegetable toppings as something that would make the pizza healthier and adds them.
And this also works for adjusting settings such as brightness or removing noise. In the example presented, you ask for the sky in a photograph to be bluer, and MGIE analyzes the prompt and the context of the image to work out that the edit to apply is a 20% increase in saturation.
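To make the idea concrete, here is a minimal sketch in Python of how a plain-language instruction might bottom out in a classic image operation like the 20% saturation boost mentioned above. This is purely illustrative: MGIE actually uses a multimodal language model to interpret the instruction, while the hand-written mapping below (the hypothetical `apply_instruction` function) just simulates that step with Pillow.

```python
# Hypothetical sketch of instruction-based editing, assuming Pillow
# is installed. The keyword mapping stands in for the language model
# MGIE really uses; it is not Apple's actual pipeline.
from PIL import Image, ImageEnhance


def apply_instruction(img: Image.Image, instruction: str) -> Image.Image:
    """Toy interpreter: map a plain-language request to an edit."""
    text = instruction.lower()
    if "bluer" in text or "saturat" in text:
        # "Make the sky bluer" -> +20% saturation, as in MGIE's example
        return ImageEnhance.Color(img).enhance(1.2)
    if "brighter" in text:
        return ImageEnhance.Brightness(img).enhance(1.15)
    # Unrecognized instruction: leave the photo untouched
    return img


# Usage with a stand-in image (a real photo would come from Image.open)
photo = Image.new("RGB", (64, 64), (100, 120, 200))
edited = apply_instruction(photo, "make the sky bluer")
```

The interesting part is precisely what this sketch fakes: a model like MGIE infers the low-level operation from the image context on its own, with no keyword list.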
Regardless, MGIE as such won’t be what Apple adds to iOS 18, macOS 15 and company. At least not as it was launched, since it is still a fairly modest project with online access that cannot yet be considered a permanently available tool.
However, it may be an important clue about what is to come. AI-driven changes to Siri are also expected, and in my opinion that is the area where I most want to see Apple’s progress. My hunch is that they will deliver, although we will not clear up any doubts until June.
Cover image | Álvaro García M. with DALL-E 3
In Applesfera | Apple’s latest purchase makes its big bet for 2024 very clear: artificial intelligence
In Applesfera | Can you download the Google Play Store on an iPhone? This is the case with iOS 17.4