Prof. Michael Klein’s Post

Professor for Digital Film Design – Animation/VFX at Mediadesign University of Applied Sciences

Prof. Tim Bruysten, society's greatest concern and threat has begun. This image of a tortured Julian Assange is currently being shared uncritically, hundreds of times and without citation of the source, on Twitter. There are still features that expose it as an AI image, such as strange colour gradients, unnatural hairlines and the incomplete, incorrect tiling of the wall. I assume it was created with Midjourney 5. And now thousands of people foaming with rage don't see that. How do we deal with this once these images are generated to perfection?

Luke Hayne

CEO, Senior 3D Technical Artist @ Photini By Design Ltd

1y

The thing with people is that we often take things at face value, whether it be a deepfake, a YouTube video, social media posts or even mainstream media. As a species we often align ourselves with whatever is presented to us if it triggers or falls in line with our emotions, which stem from concerns, facts, validations, fears and inspirations, regardless of whether it is false or not. The idea of "is or is not", or any other form of polarized inner and outer conflict, will often reject the notion of "could be and could not be". And even when one applies the mechanics of a NAND gate table, we can still find ourselves lost in a matrix of an external and internal distorted perceptual reality. Our only real truth comes from a deep inner state of peace where emotion and bias cease to exist.

Prof. Tim Bruysten

Future’s beauty liberates.

1y

Yes, that is not so easy to evaluate. Maybe it is a fake picture, which also does not represent reality. Maybe it is a real picture, which gives a false impression of reality. Maybe it is a fake picture, but it shows what really happened. (Most likely, it's a fake picture, as you said, Prof. Michael Klein.) Faking pictures - even in "photo-realistic" quality - is as old as photography itself. Anyone who has ever exposed a picture in a wet lab knows that it didn't even require Photoshop. And this is not the only reason why we scientists flinch when someone talks about truth. What do you think will be going on in courtrooms, insurance appraisals, detective agencies, or on Tinder now? Reality is something for beginners, and we will have to learn - as individuals, as a society, as companies and as politics - to update the terms knowledge, facts, truth, reality, etc. to the level at which Weber, Husserl, Popper, Kuhn, Mill, Feyerabend or Paul Hoyningen-Huene already were decades ago. (You can also add old Plato with his cave allegory to the list.) Society will only be able to tackle this with genuine education, if at all - not with its widespread function-optimizing surrogate.

Harald Oehlerking

Freelancer, Illustration and Graphics

1y

If this is fake, and there are many parts of the picture to underline this, as Michael pointed out, then the real problem is that a person with some knowledge of Photoshop could easily have corrected the AI's mistakes. People who say that AI is no match for the human brain forget how steep its learning curve has been since these programs were released, and it will become a real problem when it has a sort of Renaissance moment and discovers how perspective, scale and human bodies actually work and interact. It's not a question of how fake something is, but whether it can be made so perfect that not even a pro can tell.

Max Zimmermann

Co-Founder & Designer at FIFTYEIGHT PRODUCTS

1y

I am glad that I can still spot the AI Midjourney aesthetics from the smooth skin tones and the hair that looks too perfect. But for how long?
