
  • I hate Taylor because she is a performative liberal who espouses progressive politics but doesn't follow through, making her a hypocrite.

    I'm curious, I thought she wasn't very political (just basic human decency stuff). What does she espouse but not follow through on?

  • Are they insane? This is not just cruel and disgusting, it's also incredibly stupid. This will seriously damage their reputation among other nations.

    Damaging another nation's food supply is a crime against humanity.

  • No worries, I'll have to research this stuff later. My ideal / current dream would be something like this: all the walls of a house (or catamaran) are roughly 80 cm deep greenhouses with aquaponic crops producing strawberries or similar supplemental crops.

  • Nice post. A while back I read something on reddit about a theory that technological advances always end up being used for the worst possible nightmare scenario, but I can't find it now. Fundamentally I'm a technological optimist, but I can't yet fully imagine the subtle systemic issues this will cause. Except the rather obvious one:

    Algorithms on social media decide what people see, and that shapes their perception of the world, which is relatively easy to manipulate in subtle ways. If AI can learn to categorize the psychology of users and then "blindly anticipate" what and how they will respond to stimuli (memes / news / framing), then that would in theory allow total control by means of psychological manipulation. I'm not sure how close we are to this; the counter-argument would be that AI or LLMs currently don't understand at all what is going on, but public relations / advertising / propaganda works on emotional levels and doesn't "have to make sense". The emotional logic is much easier to categorize and generate. So even if there is no blatant evil master plan, just optimizing for max profit and max engagement could make the AI (dumbly) pursue a broad strategy that is more evil than that.

  • Have you tried out writing prompts for an image-generating AI? If you have some idea and play around with it, it's quite a new thing. An extension of human imagination. YMMV

    AI is helping us to correctly predict protein folding, which will enable new medications. Afaik it's a major breakthrough that could allow alleviating a lot of suffering.

  • Yeah. I think there is a kind of power grab under way. Social media companies will try to push the claim that they own the IP rights to the large bodies of text used for LLMs. This will then require that producers of LLM software acquire licensing rights, which will cost many millions, which in turn restricts the free use of LLMs and, in general, of any AI software that requires training data.

    The end result is that as the "means of production" become less based on human work, the "means of generation", i.e. AI, will be controlled by the capitalists. If you can turn something into a commodity (like knowledge, with patents and IP), you can control it. Leading to a darker timeline.

  • Ah thanks, I've been meaning to look into plants and how many you'd need to make an impact. Unfortunately the article doesn't mention numbers, like how much CO2 one human produces in a normal or in a well-insulated sealed room, how much of that plants can scrub, and how much light they need.

    Another way to think about this would be in terms of calories: if plants were to produce 100% of our calories as edible sugars or starches, the CO2 they fix should roughly match how much CO2 we exhale (rough numbers at the end of this comment). To grow enough potatoes for your daily calories you need about 250 m². You'd probably need less, and that would be for a perfectly sealed room.

    I don't know about blood concentration, but if it's fine then presumably it isn't affecting cognitive abilities yet. It might also vary from human to human how well they can oxygenate / expel CO2.
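
    A rough sketch of that calorie-to-CO2 matching (purely back-of-the-envelope; the 2000 kcal/day intake and treating all calories as carbohydrate are my assumptions, not numbers from the article):

    ```python
    # Back-of-the-envelope: CO2 exhaled per day if all calories were burned as
    # glucose (C6H12O6 + 6 O2 -> 6 CO2 + 6 H2O). By mass balance, plants that
    # grew that same calorie intake would fix roughly the same amount of CO2.

    KCAL_PER_DAY = 2000        # assumed daily intake
    KCAL_PER_G_CARB = 4        # rough energy density of carbohydrate
    G_PER_MOL_GLUCOSE = 180    # molar mass of glucose
    G_PER_MOL_CO2 = 44         # molar mass of CO2
    MOL_CO2_PER_MOL_GLUCOSE = 6

    glucose_g = KCAL_PER_DAY / KCAL_PER_G_CARB                    # ~500 g glucose
    co2_kg = glucose_g / G_PER_MOL_GLUCOSE * MOL_CO2_PER_MOL_GLUCOSE * G_PER_MOL_CO2 / 1000

    print(f"glucose burned: {glucose_g:.0f} g/day")
    print(f"CO2 exhaled:    {co2_kg:.2f} kg/day")  # roughly 0.7 kg/day
    ```

    So whatever planted area could grow 100% of your calories would also, by the same mass balance, roughly scrub your exhaled CO2, which is what the potato figure stands in for.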

  • Current AI can not conceptualise – much less realise – ideas, and so they can not be creative or create art by any sensible definition.

    I kinda 100% agree with you on the art part, since it can't understand what it's doing... On the other hand, I could swear that if you look at some AI-generated images it's kind of mocking us. It's a reflection of our society in a weird mirror. Like a completely mad or autistic artist who is creating interesting imagery but has no clue what it means. Of course that exists only in my perception.

    But in the sense of "inventive" or "imaginative" or "fertile" I find AI images absolutely creative. As such it's telling us something about the nature of the creative process, about the "limits" of human creativity - which is in itself art.

    When you sit there thinking up or refining prompts, you're basically outsourcing the imaginative, visualizing part of your brain. An "AI artist" might not be able to draw well or even have the imagination, but he might have a purpose or meaning that he's trying to visualize with the help of AI. So AI generation is at least some part of the artistic or creative process, but not all of it.

    Imagine we could have a brain-computer interface that lets us perceive virtual reality like with an extra pair of eyes. It could scan our thoughts and allow us to "write text" with our brain, and then immediately feed back a visual AI-generated stream that we "see". You'd be a kind of creative superman. Seeing / imagining things in your head is of course what many people do their whole lives, but not in that quantity or breadth. You'd hear a joke and you would not just imagine it, you'd see it visualized in many different ways. Or you'd hear a tragedy and...

  • Lol, you don't know how cruel that is. For decades programmers have devoted their passion to creating hyperrealistic games and 3D graphics in general, and now, poof, it's here as if by magic wand, and people say "yeah well, you should have made your 3D engine look like the real world, not look like shit" :D

  • Sorry, I don't have good sources. I just read about this a few years back, with one study showing there is already an effect.

    While you'd probably need to study this further to be 100% certain, I don't think it's much of a stretch to assume ventilation is going to help less, because constant air exchange now moves concentrations towards ~400 ppm instead of ~280 ppm. So the typical indoor level (median or whatever) could end up more than 100 ppm higher than it used to be. Or maybe I'm getting this wrong?
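
    If I'm picturing it right, the usual steady-state balance is C_indoor = C_outdoor + G/Q (CO2 generation rate over outdoor-air ventilation rate), so ventilation can only pull a room down toward the outdoor level. A minimal sketch; the per-person figures (0.005 L/s exhaled CO2, 10 L/s outdoor air) are illustrative assumptions of mine, not from the study:

    ```python
    # Minimal sketch of the steady-state indoor CO2 balance:
    #   C_indoor = C_outdoor + G / Q
    # G = CO2 generation rate, Q = outdoor-air ventilation rate (same units),
    # so ventilation only drags the room toward the outdoor concentration and
    # a higher outdoor baseline shifts the whole indoor level up with it.

    def indoor_ppm(outdoor_ppm: float, generation_lps: float, ventilation_lps: float) -> float:
        """Steady-state indoor CO2 (ppm) for a given generation rate and airflow."""
        return outdoor_ppm + (generation_lps / ventilation_lps) * 1_000_000

    # Assumed figures: ~0.005 L/s CO2 per sedentary person, 10 L/s outdoor air per person.
    for outdoor in (280, 400):
        print(f"{outdoor} ppm outside -> {indoor_ppm(outdoor, 0.005, 10):.0f} ppm inside")
    ```

    The increment G/Q stays the same either way, so the ~120 ppm rise in the outdoor baseline carries straight through to the indoor level.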