This is going to start creeping into absolutely everything. A human will no longer be required to do a job here, then there — oh no, Becky from accounts is retiring, taking her forty years of experience with her... wait, we don't need to replace her, this new AI thing can do her job with greater accuracy and infinitely more efficiency... if we take the job off payroll we can invest her salary back into the company (might just use it for bumper dividends this year as I'm a shareholder lol).
This to me highlights the sheer amount of bullshit work that is done (or created), only it will finally be understood as such by getting something else to generate it.
Personally I get really strawberry floating tired of endless documentation of meetings with obvious action points that nobody will write down, and of nobody completing the menial work necessary to get to a certain result (i.e. prevarication, analysis paralysis, creative delaying). Decisions have to be made on multiple layers by multiple people who either make lazy decisions or defer judgement to somebody else (no key decision maker), or you're selling something to a customer who doesn't know what they want and won't tell you (so that you could just give it to them). In other words, bad leadership in any situation, which leads to bullshit work justifying other bullshit work, when a single intelligent, creative person could probably do the whole damn thing given the authority to do it. Or an AI. Whatever, as long as this entire field of made-up work strawberry floats off forever.
People incapable of creative thought or problem solving will struggle and, well, about time. Why anyone would want to consistently churn out mundane tasks (copying information from one perfectly good format into another for no reason other than that someone else thinks it looks better, without ever objectively evaluating the merit of the output, and so generating pointless tasks and write-ups) is beyond me.
The amount of pointless work people do in office jobs is absolutely astonishing. I think this will bring more value to invention, the creative arts and the sciences, which is a good thing. Then maybe the people who come up with the ideas, products and solutions ultimately being sold, but who get no credit for that, are paid poorly, or work in bad conditions because budgets are allocated to bullshit work instead, will actually become valued.
Oblomov Boblomov wrote: AI will meet and quickly surpass human intelligence. Machines will meet and quickly surpass (in many ways they already have, for several decades) human capability.
Combine the two, and what task would you put a human on for any reason other than not having the resource available to get AI to do it instead?
The only jobs humans will be needed for are the ones deemed too menial and unimportant to waste money on by installing an AI workforce. As the cost of installing this AI workforce goes down, and the capability of employers to do so goes up, the sphere of human jobs will shrink indefinitely.
I don't think this is true. It hasn't really been demonstrated yet that an AI can come up with compelling, meaningful, philosophical and spiritual art, for example, or design and test a good household product; something that is not an amalgamation or a copy.
Not demonstrated yet, you're right.
We like to believe that humans create things using the power of their souls, or some sort of spiritual connection with the universe, or whatever, but actually the creative output of any human who has ever lived has been the result of their brainpower — the manifestation of their intelligence. Intelligence is just the extraordinarily complex result of data flying around in our brains. Except, it's no longer extraordinary, and AI brains have essentially limitless potential to harness it.
I think you're talking about some kind of event horizon or nexus, at which point a mass of neural networks / AIs can match the extreme complexity of an individual and their ability to respond intuitively to stimuli and synthesise that as something like an objet d'art or a design.
However, I am also coming at it from the perspective that creativity/synthesis born from learned processes and intelligence (I don't believe much in talent; art, for example, is mastered) is an inherently unoriginal process, created in part by combining existing ideas or data points that originated in some other way. No art can be truly original, for art is neither conceived nor exists in an intellectual or aesthetic vacuum.
Despite those perspectives being in alignment, one thing being overlooked is the spiritual and philosophical worth of something created by an AI compared to something created by a human, based largely on their lived experience and civic context, i.e. life in a community, which brings meaning and value (post the supposed death of the author). Does an AI have lived experience, or is it an amalgamation of the lived experience of humans, or indeed of other AIs?
If someone does not know something was created by an AI, or an artist, for example, instructed an AI to create the art for them, does it matter whether it was human-made or not? In the framework of the personal theory I discussed in my undergraduate thesis, if an artist appropriates or creates the devices that themselves create the art, then it is the same as "their" art, while also standing on its own merits, distinguished from the author; whoever claims authorship doesn't matter.
So the real question posed by your line of enquiry, in reply to my post, is not whether AI can create art or designs, but whether humans care whether it is made by an AI, and how that affects their valuation or reading of the work itself.
Will humans value AI art or designs as they do human ones? In my opinion, no, I don't think so, even if I don't agree with their reasons. While what you say is true, that people tend to romanticise the act of creation as something spiritual, philosophical or pontifical in some way (the romantic-genius view of art, which I find preposterous, and over which both audiences and artists should humble themselves, for they are not that special), that doesn't mean people question their own biases, mindsets and the frameworks in which they have come to know art or criticise it.
In essence, you would probably have to have an AI pass the Turing test, or run a double-blind test on art created by artists and by AIs, to determine whether something has the same cultural value when made by an AI. Because culture is, by definition, a population's collective response to, or understanding of, the work and activities of other human beings.
And when an AI's dataset is based upon billions of pieces of human-made artwork, it will suffer from exactly the same problem as any art student does: its work is simultaneously judged on a nonsensical balance of originality and relevance to (or research into) reference artworks. An impossible standard, as both are required while being diametrically opposed to one another.
Edit: I'd also say it's not just their raw brainpower, it's the lens through which they experience and decode life, and then reassemble it in the frontal cortex and some other areas. An AI cannot possess such a lens unless it builds and develops its own, in which case that lens may only be relevant to other AIs that experience the world the way an AI does, if it experiences it at all. You could somehow transplant an artist's lens or view of the world into an AI, but then it is effectively an artistic clone of the original human. It would have to be capable of, say, only 5% derivative thought to be on a par with a human who is inspired by another human's view of the world. You could call this their philosophical outlook, or their musings upon life that inspire their creations.
So then we have AIs making art that is valuable to, or somehow makes the most sense to, other AIs. What does that look like? If AIs dream, can they interpret each other's dreams collectively, or is it all just meaningless gibberish? Do AIs dream in binary? Dreaming is fundamentally connected to artistic creation. Most ideation is basically dreaming in the daytime. Some artists create work based on their dreams. So can AIs dream? I'm not talking about Google's DeepDream here, I'm talking about the unconscious act of dreaming.
Can an AI dream unconsciously? If an AI is not conscious, then is dreaming all it does? Is that by definition artwork, because a lot of artwork is created by accident, as a consequence of simply living with an intrinsically deep relationship to what we observe around us? Most artists don't consciously create (most designers should); they do so when they don't expect it, and then they capitalise on the results as the conscious exercise. We call this a discipline or "practice" in the arts because it is a choice, and it takes practice to make art useful and apply it (you can't do that by mistake).
Green Gecko wrote: Can an AI dream unconsciously? If an AI is not conscious, then is dreaming all it does?
One explanation of dreams that I find compelling is that it's a way for the brain to remix experienced events into novel scenarios to refine its predictive ability and help in abstract modelling of concepts. That is pretty much what a lot of modern AI is doing, yeah, just prompted rather than however the brain decides what to dream about (which is still effectively prompted, just less directly).
Right now, an AI isn't a virtual person, even in a limited capacity. It's a mechanism which can automate tasks requiring conceptual reasoning. Broadly, AI is doing for ideas what the power loom did for weaving.
In the short term, I don't think the disruption to creative fields will be projects wholly authored by AI -- it will be that the creative process will be heavily assisted by AI. Perhaps one artist will be able to do the work of a whole animation studio. (In a socialist society this would be wonderful news -- in our society, well, better join a union...)
In the long term, I think we absolutely will build something that exhibits "will" or "consciousness." Current projects are establishing a lot of the underpinnings. For example, GPT-4 has the ability to fluently interpret an image in terms of language, translating the depicted objects to descriptive concepts -- which is not "thinking about what it sees" but is a meaningful step towards a system that does that.
I think that's very interesting to think about, but it will be another paradigm shift which is separate to the one happening right now, with its own effects and implications.
site23 wrote: Right now, an AI isn't a virtual person, even in a limited capacity. It's a mechanism which can automate tasks requiring conceptual reasoning.
To be fair to AI, that pretty much describes me.
You are like a power loom but for The Politics Thread