RT @ClaireSilver12: stable diffusion 2.1 is pushing the ceiling on the uncanny valley #AIart https://t.co/zmlQnHQOVB
1613891440194486277
Curation vs platform mindset + algorithmic selection. The position of a curator is one of incredible power and incredible value. I learned it 20y ago, when I built virtualization[.]info
#AI is not making curation better. Yet.
The platform and the curator https://t.co/CZMfPTqaRU
1613882977934610432
That’s why asking an analysis firm to talk only about success stories is pure madness. https://t.co/Djh9mKRn21
1613870906224304128
Right now, every mobile and desktop app that features a large language model requires a subscription (usually needed to pay for the API calls to OpenAI). The first one to find a way to offer access to GPT-x for free will have a huge competitive edge.
1613870908871159808
(at least until @StabilityAI releases its own version, unlocking free access for all)
1613865553327710208
@camenduru Thanks for this great upgrade on an already exceptional collection of #stablediffusion notebooks.
As you surely know, people can speed up booting a bit by mounting their GDrive when it already contains a model. https://t.co/WyVwkeFb4l
1613869287982223361
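For anyone wondering what that tip looks like in practice, here is a minimal Colab sketch. It assumes the notebook uses the diffusers library and a hypothetical Drive path; the actual notebooks may wire this up differently.

```python
# Minimal Colab sketch: reuse a Stable Diffusion checkpoint already stored on GDrive
# instead of re-downloading it at every boot. Paths are illustrative assumptions.
import os
from google.colab import drive
from diffusers import StableDiffusionPipeline

drive.mount("/content/drive")

# Hypothetical location where a previously downloaded model was saved.
LOCAL_MODEL_DIR = "/content/drive/MyDrive/stable-diffusion/v2-1"

if os.path.isdir(LOCAL_MODEL_DIR):
    # Fast path: load straight from Drive, no network download needed.
    pipe = StableDiffusionPipeline.from_pretrained(LOCAL_MODEL_DIR)
else:
    # Slow path: download from the Hugging Face Hub, then cache to Drive for next time.
    pipe = StableDiffusionPipeline.from_pretrained("stabilityai/stable-diffusion-2-1")
    pipe.save_pretrained(LOCAL_MODEL_DIR)
```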
Recently, open source pioneer and developer extraordinaire @migueldeicaza launched a completely free SSH terminal for iOS. Well, it’s just been updated and now it features an #AI assistant powered by #gpt3. For free. No in-app purchases. https://t.co/zHkLZekMs4
1613865746047668224
@camenduru I just wish there were a way to save CivitAI models straight into the mounted GDrive folder. That would speed up the process even more.
1613680946821140481
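Until that is supported natively, a rough workaround is to stream the download straight to the Drive path yourself. The sketch below assumes CivitAI's public download endpoint and uses a placeholder model-version ID and destination path.

```python
# Sketch: stream a CivitAI checkpoint straight into the mounted GDrive folder,
# so it never has to be re-fetched in the next Colab session.
# The model-version ID and filename below are placeholders, not a specific model.
import requests

DOWNLOAD_URL = "https://civitai.com/api/download/models/<MODEL_VERSION_ID>"
DEST_PATH = "/content/drive/MyDrive/stable-diffusion/models/custom-model.safetensors"

with requests.get(DOWNLOAD_URL, stream=True, allow_redirects=True) as resp:
    resp.raise_for_status()
    with open(DEST_PATH, "wb") as f:
        for chunk in resp.iter_content(chunk_size=1 << 20):  # 1 MiB chunks
            f.write(chunk)
```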
@nikitabase An LLM, accessible via a chat interface, reduces to zero the friction of formulating the question and significantly reduces the friction of gathering the answer. But you still have to come up with the question and, on this, I’m sceptical about humans.
1613677719367081985
RT @olivercameron: We’ve all rightfully been taken by LLMs and how impressive the current state-of-the-art is.
But, don’t sleep on compute…