
Adobe Firefly is a generative AI model that Adobe has been developing for years, in collaboration with Nvidia. Nvidia anticipated a shift in how its GPUs would be used: as demand from crypto mining declined, it predicted that buyers would instead purchase them to train AI models. That prediction has proven accurate.
Most large language models and generative AI systems don't rely on a unique "secret sauce." They are built on vast training datasets of images paired with text, or text paired with classification labels, supplemented by human feedback. The key ingredients are substantial financial investment, a large team, and extensive training data. This is why, shortly after the release of models like DALL-E, alternatives like Stable Diffusion emerged. These models aren't created in isolation; they are built by teams that can interpret publicly available AI research papers and assemble the necessary resources.
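The image-and-text pairing described above can be sketched with a toy, CLIP-style contrastive objective: matching (image, caption) pairs should score higher than mismatched ones. The embeddings and shapes below are random stand-ins for illustration, not any production model's internals:

```python
import numpy as np

rng = np.random.default_rng(0)

# Pretend embeddings for 4 (image, caption) training pairs, dimension 8.
# In a real system these come from image and text encoders.
image_emb = rng.normal(size=(4, 8))
text_emb = rng.normal(size=(4, 8))

def normalize(x):
    # Unit-normalize each row so dot products become cosine similarities.
    return x / np.linalg.norm(x, axis=1, keepdims=True)

# Similarity of every image against every caption.
logits = normalize(image_emb) @ normalize(text_emb).T

# Softmax over captions for each image; the diagonal holds the
# probability assigned to each image's true caption.
probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)

# Contrastive loss: penalize low probability on the matching pairs.
loss = -np.log(np.diag(probs)).mean()
```

Training pushes `loss` down, which is what ties the visual content to the paired text; the scale of the real thing (billions of pairs, huge GPU fleets) is the investment the paragraph above refers to.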
Adobe and Nvidia, along with other companies, have been collaborating for years to generate AI services revenue. Nvidia is transitioning to become a services company alongside its hardware business. This shift isn't happening in secret; information is available through press releases and on their websites.
Adobe Firefly isn't designed to compete directly with models like DALL-E or Stable Diffusion by selling API access. Instead, it's part of a broader transition that every tech company will undergo. New spaces and markets are emerging, and companies must decide whether to build or integrate AI technology. This shift is akin to the rise of mobile and cloud technologies, offering a competitive advantage.
Adobe's product managers are keenly aware of the competitive landscape. Most Adobe products cater to creative professionals, and the first applications of their Sensei AI will empower creatives to achieve previously impossible tasks. This technology isn't about replacing creativity but augmenting it. For instance, Adobe Stock's vast image library could be complemented by AI-generated images, offering a new business line.
Consider a future version of Photoshop paired with a large language model, an advanced iteration of something like Stable Diffusion. This could enable in-painting and photo editing on the fly, without requiring users to master complex commands or tutorials. The ability to interact with such a tool through speech or simple typing could revolutionize creative workflows.
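As a rough illustration of the in-painting mechanics mentioned above, the sketch below composites generated pixels into an image only inside a user-selected mask, leaving everything else untouched. The random arrays are stand-ins for a real photo and a real generator's output:

```python
import numpy as np

rng = np.random.default_rng(1)

original = rng.uniform(size=(4, 4))   # stand-in for the existing photo (4x4 grayscale)
generated = rng.uniform(size=(4, 4))  # stand-in for a generative model's output
mask = np.zeros((4, 4))
mask[1:3, 1:3] = 1.0                  # region the user asked to replace

# In-painting blend: generated content inside the mask,
# the original photo everywhere else.
result = mask * generated + (1 - mask) * original
```

A speech or text interface would only need to translate "remove that sign" into a mask plus a prompt; the blend itself is this simple, which is why the hard part is the generator, not the editing step.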
Adobe recognizes the importance of being at the forefront of this technological shift. By developing elements of these advanced tools, Adobe aims to preemptively compete against potential rivals. This strategic foresight ensures Adobe remains a leader in the creative software market, ready to meet the demands of the future.
By Indie.am