WYSIWYG is a concept that pre-dates the web, and what this article is talking about is not the same thing. WYSIWYG was coined as a term to describe word processing and desktop publishing software where the appearance of your text matched the final printed output: the same fonts, weights, sizes, styles, etc. That's it.
It's something we mostly take for granted today but was a real advancement over earlier, often text-based, programs that used simple text effects like highlighting or different colors to represent visual effects that were only fully realized when you printed your document.
It has nothing to do with being able to view source, or copy other designs, or any of that.
WalterBright [3 hidden]5 mins ago
WYSIWYG came about when displays became bit-mapped graphics with a sufficient number of dots per inch.
Previously, displays used a character generator ROM chip which mapped each ASCII code onto a character glyph. For a terminal I designed and built in those days, I used an off-the-shelf character generator chip which had a 5x7 font.
The original IBM PC used a character generator.
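For the curious, here's a rough sketch of what that kind of lookup amounts to (the glyph data below is invented for illustration, not dumped from any real ROM):

    # Sketch of a character-generator lookup: each character code indexes a
    # table of 7 row bitmaps, each 5 bits wide. Glyph data is invented for
    # illustration, not taken from a real ROM.
    FONT_5X7 = {
        'H': [0b10001, 0b10001, 0b10001, 0b11111, 0b10001, 0b10001, 0b10001],
        'I': [0b11111, 0b00100, 0b00100, 0b00100, 0b00100, 0b00100, 0b11111],
    }

    def render(text):
        # Emulate the scan-line output: for each of the 7 rows, walk the
        # string and emit that row's 5 pixels per character, the way the
        # video circuitry does while sweeping across a text row.
        for row in range(7):
            cells = []
            for ch in text:
                bits = FONT_5X7.get(ch.upper(), [0] * 7)[row]
                cells.append(''.join('#' if bits & (1 << (4 - c)) else ' '
                                     for c in range(5)))
            print(' '.join(cells))

    render("HI")

The real hardware did roughly the inner loop in silicon: the ROM address was formed from the character code plus the current scan line, and the 5-bit row that came out was shifted out as pixels.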
layer8 [3 hidden]5 mins ago
The term was later also extended to things like visual GUI builders, where the appearance in the editing interface matches the appearance of the final GUI (e.g. the Visual Basic form editor). This specific WYSIWYG variation mostly hasn't returned, unfortunately.
skissane [3 hidden]5 mins ago
> It's something we mostly take for granted today but was a real advancement over earlier, often text-based, programs that used simple text effects like highlighting or different colors to represent visual effects that were only fully realized when you printed your document.
I am tasked with maintaining documentation in Confluence and Notion, and I wasn’t enjoying it. Then I built a system with bidirectional sync between the two of them and a Git repo full of Markdown documents, and now I find the task to be much more pleasant.
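The shape of it is roughly this (a minimal sketch, assuming the Git repo of Markdown is the source of truth; the push helpers are hypothetical stand-ins, not the actual Confluence or Notion API calls):

    # Minimal sketch, one direction only (Git -> wikis). The push helpers are
    # hypothetical stand-ins for code that would call the real Confluence and
    # Notion REST APIs and convert the Markdown to each tool's format.
    from pathlib import Path

    DOCS = Path("docs")  # local Git checkout full of Markdown files

    def push_to_confluence(slug: str, markdown: str) -> None:
        # hypothetical: convert Markdown to Confluence storage format and update the page
        print(f"[confluence] would update '{slug}' ({len(markdown)} chars)")

    def push_to_notion(slug: str, markdown: str) -> None:
        # hypothetical: convert Markdown to Notion blocks and patch the page
        print(f"[notion] would update '{slug}' ({len(markdown)} chars)")

    def sync_all() -> None:
        # The reverse direction pulls remote edits back into the repo, so Git
        # keeps the history and handles conflicts.
        for md_file in sorted(DOCS.glob("*.md")):
            body = md_file.read_text()
            push_to_confluence(md_file.stem, body)
            push_to_notion(md_file.stem, body)

    if __name__ == "__main__":
        sync_all()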
anonymous908213 [3 hidden]5 mins ago
Aside from the LLM writing vibes, or perhaps because it was written by an LLM, I think this article has very little tether to reality.
> It’s bringing back something we collectively gave away in the 2010’s when the algorithmic feed psycho-optimized its way into our lives: being weird.
It's really not. Prompting an LLM for a website is the exact opposite of being weird. It spits out something bland that follows corporate design fads and which contains no individuality. If you want to see weird websites, people are still making those by hand; the recently posted webtiles[1] is a nice way to browse a tiny slice of the human internet, with all its weirdness and chaotic individuality.
[1] https://webtiles.kicya.net/
> It's really not. Prompting an LLM for a website is the exact opposite of being weird. It spits out something bland that follows corporate design fads and which contains no individuality. If you want to see weird websites,
I see your point, but I disagree. You consider part of the "weirdness" to be how it's done; and yes, it is indeed "weird" to learn several languages, consisting mostly of punctuation, in order to create an online self-promotion. But I think for most people, the "weirdness" (or its absence) is to be found in the end result. To that end, if a person wants a personal web page with animated tentacles around the edges and flying bananas in the background and pictures of demonic teddy bears, that is something that an AI can easily do when asked.
bigbuppo [3 hidden]5 mins ago
Back in the bad old days, people created websites because they had no choice in the matter. You simply had to do that to share anything with the rest of the world. Most of the tools we had back then still exist. The barrier to entry has never been lower, and those that are motivated to tinker do just that. But going through history... once mainstream blogging became a thing, and then social media conquered all, the motivation to share with others became monetized, as did the methods of sharing with others. AI isn't going to fix that. On the flip side, those same monsters that destroyed the world we knew through monetizing everything are the same ones spending trillions of dollars on AI.
WalterBright [3 hidden]5 mins ago
> those same monsters that destroyed the world we knew through monetizing everything
That's why we get to use google for free.
I use a ton of excellent free software.
dtgriscom [3 hidden]5 mins ago
I've always wanted a DWIMNWIS code editor: "Do What I Mean, Not What I Say". These days it's likely that AI at least tries to provide this.
niko_dex [3 hidden]5 mins ago
This reads like a love letter to our collective youth. I like the perspective! It's interesting too, because I feel a lot of programmer types might see WYSIWYG and AI both as stepping stones towards a more disciplined approach to engineering.
bluedino [3 hidden]5 mins ago
No mention of Geocities?!
mempko [3 hidden]5 mins ago
Read it again.
kylehotchkiss [3 hidden]5 mins ago
> The barrier to entry is lower than it’s ever been.
I don't see a web full of projects created by people who aren't technical. A substantial number of young people grew up on phones and iPads and might not even understand filesystems well enough to have the imagination to create things like this. So the power exists, but the people taking best advantage of it seem, to me, to be the same people who were building things before LLMs came along.
hackyhacky [3 hidden]5 mins ago
> I don't see a web full of projects created by people who aren't technical.
Sure, but this is very new technology. It will take some time for the idea of building software easily to seep into the public consciousness. In that time, AI will get better and the barrier to entry will get even lower.
For comparison, the internet has been around in some form since the 1960s (more or less, depending on which technology you consider its beginning), but it took until the late 1990s or even early 2000s before most people were aware of it, and longer still before it became central to their lives. I would expect the development of AI-coding-for-the-masses to happen much faster, but not instantaneously.
bigbuppo [3 hidden]5 mins ago
So the internet is newer than AI?
dismalaf [3 hidden]5 mins ago
What's a "project"? How about a Shopify store? A Substack or WordPress site?