Because the replacement for the graybeards comes from non-graybeards in FOSS, their replacement from the beardless in FOSS, who come from youths in FOSS, who come from teens geeking around with computers, and oops: teens are not geeking around with computers anymore, they are watching reels, scrolling recommendations, and doing other bullshit. If they have a PC, it’s an unloved work tool for them, with crappy bloated Windows, crappy bloated software for work and studies, and games that are bloated if not always crappy. You get the idea.
Because there was a generation very fertile in geeks. It’s going away. There are demographic pits and there are demographic, what do you call them, hills? The point is, we are seeing the effects of the latter going away.
who come from teens geeking around with computers, and oops: teens are not geeking around with computers anymore, they are watching reels, scrolling recommendations, and doing other bullshit.
“Youth bad.” Lazy take. As if everyone in the gray beard generation was tinkering around with computers? Plenty of youths still tinker. Posting condescending shit like this is just going to turn them off from pursuing/contributing.
I don’t see anything that could be considered a “Youth bad” statement in that comment. It’s a complex issue, influenced by a myriad of factors.
For example, I could disassemble and reassemble my first PC without any prior knowledge. I had to learn to use DOS to navigate the OS and get things done. I got a book from the library about it, and spent hours upon hours just learning how the file structure, commands, programs, external media, etc. worked before I could do anything remotely useful.
Today a PC/tablet/phone is a black box; you have to actively WANT to tinker in order to learn anything about how it works. And most big tech companies try to punish you for so much as trying to replace a battery yourself.
I suspect you are projecting some personal feelings onto a stranger’s comment.
They aren’t just a black box; tinkering with them actually has negative consequences. On Android, getting root access results in SafetyNet attestation failing, and on iOS you can’t get root at all unless you are happy to run some binaries from questionable sources. Things are very different for youths. As someone else stated, to tinker in the way you mentioned means getting an Arduino or some kind of tinker-friendly SBC.
I’m really confused by your reasoning here. You’re describing how it was extremely difficult for you and you had to go to great lengths to learn technology. Not everyone did this back then, nor does everyone do it now.
That was the stuff you needed to do to do things like play video games on your computer, get online and chat with people, hell, even use your PC to write an essay and print it out (or even set up your printer to begin with). You didn’t necessarily have to go as far as they described to do that stuff, but you had to do some of it.
Nowadays there’s no equivalent. You don’t have to at least kinda understand the filesystem to play Minecraft on your iPad.
I think you’re missing my point. I’m not saying everyone capable of using a computer today is equivalent to that level of dedication and curiosity. I’m just saying that in the same way only a handful of people do that today, only a handful of people pursued it in the past. It’s become easier to use computers, but that doesn’t mean there aren’t still people who learn the ins and outs of them today, like people had to do in the past just to use them.
I think you missed my point, which is that everyone who used a computer back then (less of the overall population) had to at least dip their toes in learning how they worked back then. This meant that a lot of people who otherwise wouldn’t have gotten interested in computers found out that they really liked them and started going down the rabbit hole on their own from there.
People would get a computer for school, buy a game for it, and have to learn how to fix the computer when the game didn’t work. Hell, in the 80s (as a 90s kid I missed the boat on this one) there were magazines that included games where you had to type the code for the game (in BASIC) into the computer yourself for the game to run. Many of those magazine games had bugs, so if you wanted to fix one you had to learn some basic coding skills to spot the bug/typo in the magazine.
Nowadays such entryways into the computing hobby are far less ubiquitous, you have to seek them out. I’m not saying people now are less capable or less curious or that the hardcore nerds won’t still get into the hobby, I’m saying it used to be required for playing PC games or social interactions online. There used to be incentives for people who otherwise wouldn’t have tried to figure out how computers worked, and as a result a lot of people who didn’t necessarily think of themselves as would-be computer people ended up getting into computers.
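To make the magazine type-ins mentioned above concrete: a listing might look something like this invented example (not from any real magazine). A single mistyped line number, say GOTO 44 where the program needed GOTO 40, stopped it with an “Undefined line number” error and gave you a reason to learn to read code:

```basic
10 REM GUESS THE NUMBER - TYPED IN FROM A MAGAZINE LISTING
20 LET N = INT(RND(1) * 100) + 1
30 PRINT "GUESS A NUMBER FROM 1 TO 100"
40 INPUT G
50 IF G < N THEN PRINT "TOO LOW": GOTO 40
60 IF G > N THEN PRINT "TOO HIGH": GOTO 40
70 PRINT "YOU GOT IT!"
80 END
```

Spotting why line 50 jumps to a line that doesn’t exist in a misprint was many people’s first debugging session.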
I totally agree, but nobody is saying everyone that uses a computer is a tech wiz. I’m saying there are still plenty of people today interested in digging in deeper just like there were back then. Tech being more accessible today hasn’t led to a decrease in such things, because the people the ease brought in wouldn’t have dug in in the past.
I really think it has led to a reduction in the rate at which otherwise average people wind up in tech because they like it, as opposed to for money.
In other words, the computing hobby has declined from its heyday in the 80s to early 00s. Most people who build their own PC now can do so with about 10 minutes of help from YouTube and tools like PCPartPicker, which helps a lot with accessibility, but the trade-off is that people who get into computers now don’t need to spend as much time on them to get into them. They don’t need to build up as much foundational knowledge, so that knowledge has become rarer, even within e.g. the indie PC gaming hobby.
You can literally make an entire video game without writing any code. This is phenomenal if you want to make games easily, but it also gives coding an air of inaccessibility, even in the minds of people getting into making video games, that didn’t use to exist when the two came as a package deal.
Gen Z/Alpha are the new boomers. I teach hundreds of the so-called intellectual cream of the crop per year. It was bad before the pandemic; it’s seriously concerning now. The youth have largely divorced themselves from reasoning and are used to reasoning in simple inputs to simple outputs. I am genuinely scared.
No, it’s recognizing that tinkering means different things now.
In the 80s and 90s, if you were learning computers you had no choice but to understand how the physical machine worked and how software interacted with it. Understanding the operating system and scripting was required for essentially any task outside the narrow collection of tasks covered by commercial software. There was essentially one path (or a bunch of closely related paths) for people interested in computers.
That just isn’t the case now. There are more options available, and many (most?) of them are built on top of software that abstracts away the underlying complexity. Now a person can use technology and never need to understand how it works. Smartphones are an excellent example of this: people learn to use iOS or Android without ever knowing how they work; they deal with the abstractions instead of the underlying bits that were used to create them.
For example, if you want to play games, you press a button in Steam and it installs. If you want to stream your gaming session to millions of people, you install OBS and enter your Twitch credentials. You don’t need to understand graphics pipelines, codecs, networking, or load balancing, or worry about creating client-side applications for your users. Everything is already created for you.
There are more options available in technology and it is completely expected that people distribute themselves amongst those options.
I get that, and I’m not saying differently. I’m just saying it’s not like the only reason people were learning lower-level things was to play games. Some people were just curious about it. Plenty of people are still learning those things because they’re curious. The barrier to entry being lower doesn’t mean there are fewer people who are curious about learning! If anything, it means that people who are curious, but who would have found the barrier to entry too high in the past, have an easier time getting into the hobby now.
Do you think the Arduino project has been a net negative for people curious about learning low-level microcontroller stuff? It was created by people learning that stuff, out of frustration, because they wanted it to be easier to get started. https://spectrum.ieee.org/the-making-of-arduino
Sure, sure, old man. Everything was better when you were young.
There never was a majority of people who were into computers. It was always a minority. And I’d argue that nowadays there’s more developers because there’s simply more people with access to computers.
Some of them won’t like them, some will be neutral and some will be “geeking around”.
And having seen some code from people both older and younger, the younger ones are better (note that this is my anecdotal evidence). And you can at least train the younger ones, while the “experienced” will argue with you and take energy out of your day.
I’m so tired of the stupid “when I was young, everything was better”. You know what else was exactly the same? The previous generation telling you how everything was better when they were young. Congrats, you’re them now.
Sure, sure, old man. Everything was better when you were young.
I’m 28.
There never was a majority of people who were into computers. It was always a minority. And I’d argue that nowadays there’s more developers because there’s simply more people with access to computers.
I’ve literally said that the kind of access to computers matters. In my childhood it was Windows 2000 (98SE when I wasn’t intelligent or interested enough). In those greybeards’ childhoods (I guess a greybeard is someone who didn’t have a computer in their childhood), with the programmable calculators or automatic devices (like sewing machines) manufactured back then, it was easier to grasp the initial concepts.
The human brain is not a condom; it can’t stretch to fit something as messy and big as today’s desktop OSes, general approaches, and the Web, even just to use them. It will reject it all and find other occupations. Back in 2005 the Web was more or less understandable, and desktop operating systems, at least in UI/UX, didn’t complicate matters too much.
Some of them won’t like them, some will be neutral and some will be “geeking around”.
But the proportion will change in just the way I’ve described.
And having seen some code from people both older and younger, the younger ones are better (note that this is my anecdotal evidence). And you can at least train the younger ones, while the “experienced” will argue with you and take energy out of your day.
Maybe that’s because you are wrong and like people who bend under the pressure of your ignorance. Hypothetically; this is not an attack. Or maybe the younger ones are simply the ones who don’t argue; that’s a social thing.
Also, of course, people whose experience was formed in a different environment think differently, and their solutions might seem worse to someone who prefers the current environment.
As you said, that’s anecdotal.
I’m so tired of the stupid “when I was young, everything was better”. You know what else was exactly the same? The previous generation telling you how everything was better when they were young. Congrats, you’re them now.
Well, this would mean you’re tired of your own mental masturbation because this is not what I said.
I’m talking more along the lines of everything coming to an end, with this complexity growth being one of the mechanisms through which the industry will eventually crash. Analogous to, say, citizenship through service in the Roman Empire.
Grey-stubble Gen-X’er here… The 80s and (more so for me) 90s were a great time to get into tech. Amiga, DOS, Win3.11, OS/2, Linux… BBSes and the start of the Internet, accompanied by special interest groups and regular in-person social events.
Everyone was learning at the same time, and the complexity arrived in consumable chunks.
Nowadays, details are hidden behind touchscreens and custom UXs, and the complexity must seem insurmountable to many. I guess courses have more value now.
Basically everybody making a game for the Amiga wrote the equivalent of their own graphics drivers. You programmed directly against the specialized hardware, and M68000 assembly was so easy and intuitive it was a joy to use.
But that way of programming apps is completely obsolete today. Now it’s all about abstraction layers. And for a guy like me, it feels like I lost control.
If you want to program “old school” you have to play with things like Arduino.
I’m a relic now, that’s just how it is.
Normal, mainstream software expected users to run DOS commands and edit autoexec.bat/config.sys files, and installing new hardware often involved configuring motherboard DIP switches and trying to figure out what “IRQ” and “DMA” mean. There is no equivalent to that today. Plug it in, turn it on, and you’re done. 9 times out of 10 you don’t even need to install a driver; your OS already has it. Where does the door to learning and discovery present itself? With plug-and-play systems and walled-garden app stores, everywhere a user could possibly come across some more advanced concepts has been muted and decorated over with pretty conveniences. Computers are toasters now.
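For the uninitiated, a generic illustration (not any particular machine’s setup): a typical early-90s DOS machine booted from two user-edited text files, and squeezing out enough conventional memory for a game meant tweaking lines like these. The SET BLASTER line is exactly where those “IRQ” and “DMA” numbers ended up.

```
REM --- CONFIG.SYS: load memory managers, set kernel limits ---
DEVICE=C:\DOS\HIMEM.SYS
DEVICE=C:\DOS\EMM386.EXE NOEMS
DOS=HIGH,UMB
FILES=30
BUFFERS=20

REM --- AUTOEXEC.BAT: environment and drivers loaded at every boot ---
@ECHO OFF
PATH C:\DOS;C:\WINDOWS
SET BLASTER=A220 I5 D1
LH C:\DOS\MOUSE.COM
```

Get a directive wrong and the game reported “not enough memory”, and you went back in with a text editor until it didn’t.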
Eh, if you’re into computers, you’ll find your way. My first “programming” adventures were writing batch/VB scripts, putting them in the startup folder, and watching the teacher lose their shit when their computer turned off after five seconds. Or watching the whole classroom open and close their CD drives 50 times when we were the first to have an IT class that day.
and installing new hardware often involved configuring motherboard DIP switches and trying to figure out what “IRQ” and “DMA” mean.
That part is about IBM PC architecture more than it is about computers in general, including personal computers of that time.
EDIT: I wonder why all the downvotes; this is just true, look at Macs of that time. I’m not saying interrupts themselves, or the concept of DMA itself, are PC-specific.
I think it’s more a matter of the ideals of the times. FOSS was created in the 80’s, and I see it as an ideological child of the 70’s, a period of time where progress, optimism and idealism about creating a better future and a better world probably peaked.
Of course there is also idealism today, but it’s different, at least the way I see it. The sense of quick progress, especially on the humanitarian side, is gone; the decades of peace with Russia are broken; climate change hangs like a threatening cloud above us; and the rise of China creates turbulence in the world order.
So although things maybe weren’t actually better in the 80’s, there are definitely aspects that look very attractive in hindsight.
But as I see it, the mentality for FOSS is now stronger than ever, because aside from idealism, it’s proved itself to also be a pragmatically good choice in many situations. But all the original founders and enablers are of course old today.
And complaining about how “people today” use technology is stupid, because chances are we would have done the same had it been available to us when we were young.
I disagree with your idea of real-world turbulence affecting it. Things were going the wrong way even in 2005. The dotcom bubble, the Iraq war, those things, maybe.
I actually think that the USSR’s breakup is what, in the long term, caused our world to become worse.
Say, in terms of computers and mass culture too: people sometimes treat the 90s as a result of that breakup, but that doesn’t quite make sense. Despite a few armed conflicts, it was a gradual process; the CIS as an organization was treated as almost a new union in the making even in my childhood.
That breakup has released a lot of dirty money into the world, and through not the cleanest people in western countries, too.
And ideologically, the optimist version of the Cold War ending was some syncretic version of the “western” and the “eastern” promises of a space-faring united future. And much of the 90s was about fantasies, often dystopian ones, set in the context of such a utopia.
IRL both optimist promises were forgotten. Thus the current reality.
Wow you are way off time wise, I spoke of the 70’s and 80’s. Everything you mention is AFTER that.
The FOSS idea is early 80’s and the FSF was created in the mid 80’s, and as I mentioned, based on the ideology of the 70’s.
The turnaround was after Carter when Reagan was elected, not just in USA, but also in most of Europe.
I actually think that the USSR’s breakup is what, in the long term, caused our world to become worse.
I agree, but initially it was all cool, a lot of Europe achieved freedom and democracy, and the Soviet states turned to democracy. We even had cooperation between the West and Russia initially. Unfortunately Putin completely ruined that after he came to power in 1991, which is also around the time Linux started.
Wow you are way off time wise, I spoke of the 70’s and 80’s. Everything you mention is AFTER that.
I meant the “peace with Russia” part by that, sorry.
The FOSS idea is early 80’s and the FSF was created in the mid 80’s, and as I mentioned, based on the ideology of the 70’s.
Meant exactly that: that (in my perception) there’s something in that ideology similar to the science fiction of the same time, the cinema, electronic music, industrial design and general techno-optimism. Some kind of universalism, like in Asimov’s Foundation.
Unfortunately Putin completely ruined that after he came to power in 1991, which is also around the time Linux started.
1999, 1991 is Yeltsin, but one is a logical continuation of the other (many Russian liberals disagree, love Yeltsin and hate Putin, don’t listen to them).
The turnaround was after Carter when Reagan was elected, not just in USA, but also in most of Europe.
You are absolutely right, my bad on that one. But actually under Yeltsin there was still room for optimism, and in those years cooperation between the west and Russia increased.
Given the person said they’re 28, I’m actually older. And I decided to not be a dick about it and to not pretend that everything was better when I was young. Everything was different, sure. Some things were better, some were not. But I decided to not do the whole “back in my days” thing because I always found it stupid and luckily that didn’t change with age.
What’s next, girls vs boys code? People wearing hats vs people not wearing hats code?
Manager material right there.
BTW if an old geek argues that your code design/decision is bad then you should probably listen. But that’s what beginners don’t do, they think they know it all…
I think this is also a problem of old-timers not being able to articulate their concerns well. There is probably a reason they do or don’t do something a certain way, but if they can’t explain why, then no one is going to listen. Blindly following someone for perceived wisdom doesn’t teach you anything.
I actually like it when someone can show me why I’m wrong, because it saves me time. But if you can’t tell me WHY my idea won’t work, I’m probably just gonna do it anyway to figure it out myself.
I think this is as much a case of bad teachers as it is bad students.
Glad you can read and repeat stuff! I presented it as such to avoid wannabe smartasses, guess they still arrived. Since we’ve touched on the subject of managers and hiring, do you often hear the phrase “not a cultural fit”? Wouldn’t surprise me.
If an old geek argues with a senior architect about architecture, I kinda think the architect is the one who’s right in 99% of cases.
“Youth bad.” Lazy take. As if everyone in the gray beard generation was tinkering around with computers? Plenty of youths still tinker. Posting condescending shit like this is just going to turn them off from pursuing/contributing.
It’s not youth bad. It’s that in the 80s and 90s, computers were fun and required a lot of tinkering. Nowadays they mostly work. They’re boring.
People who tinkered learned stuff. Users just know how to use a couple applications.
config.sys generation represent.
We got extended memory now! Bill Gates doesn’t know what he’s talking about.
I get that, I’m not saying differently, I’m just saying that it’s not like the only reason people were learning lower level things was only to play games. Some people were just curious about it. Plenty of people are still learning those things because they’re curious. The barrier to entry being lower doesn’t mean there are less people who are curious about learning! If anything, it means that people who are curious but thought the barrier to entry was higher in the past have an easier time getting into the hobby now.
Do you think that the Arduino project has been a net negative for people curious about learning low level microcontroller stuff? It was created out of frustration by people learning it wanting it to be easier to begin to learn. https://spectrum.ieee.org/the-making-of-arduino
Sure, sure, old man. Everything was better when you were young.
There never was a majority of people who were into computers. It was always a minority. And I’d argue that nowadays there’s more developers because there’s simply more people with access to computers.
Some of them won’t like them, some will be neutral and some will be “geeking around”.
And having seen some code from people both older and younger, the younger ones are better (note that it’s my anecdotal evidence). And you at least can train the younger ones, while the “experienced” will argue with you and take energy out of your day.
I’m so tired of the stupid “when I was young, everything was better”. You know what else was exactly the same? The previous generation telling you how everything was better when they were young. Congrats, you’re them now.
I’m 28.
I’ve literally said that the kind of access to computers matters. In my childhood it was Windows 2000 (98SE when I wasn’t intelligent or interested enough). In those greybeards’ childhoods - I guess a greybeard is someone who didn’t have a computer in their childhood, but with programmable calculators, or automatic devices (like sewing machines) manufactured then, it was easier to grasp the initial concepts.
Human brain is not a condom, it can’t just fit something as messy and big even to use as today’s desktop OS’es and general approaches and the Web. It will reject it and find other occupations. While in year 2005 the Web was more or less understandable, and desktop operating systems at least in UI\UX didn’t complicate matters too much.
But the proportion will change in just the way I’ve described.
Maybe that’s because you are wrong and like people who bend under the pressure of your ignorance. Hypothetically, this is not an attack. Or maybe just those who don’t argue, that’s a social thing.
Also, of course, people whose experience has been formed in a different environment think differently, and their solutions might seem worse for someone preferring the current environment.
As you said, that’s anecdotal.
Well, this would mean you’re tired of your own mental masturbation because this is not what I said.
I’m talking more along the lines of everything coming to an end and this complexity growth being one of the mechanisms through which this industry will eventually crash. Analogous to, say, citizenship through service for Roman empire.
Grey-stubble Gen-X’er here… The 80s and (moreso for me) 90s were a great time to get into tech. Amiga, DOS, Win3.11, OS/2, Linux… BBS’s and the start of the Internet, accompanied by special interest groups and regular in-person social events.
Everyone was learning at the same time, and the complexity arrived in consumable chunks.
Nowadays, details are hidden behind touchscreens and custom UXs, and the complexity must seem insurmountable to many. I guess courses have more value now.
Basically everybody making a game for the Amiga wrote the equivalent of their own graphics drivers: programming directly against the specialized hardware. And M68000 assembly was so easy and intuitive it was a joy to use.
But that way of programming apps is completely obsolete today. Now it’s all about abstraction layers. And for a guy like me, it feels like I lost control.
If you want to program “old school” you have to play with things like Arduino.
I’m a relic now, that’s just how it is.
Time to program the ESP8266 :-)
My wife actually used that for something she needed to be able to remote control a few years back. She tells me it’s an amazing chip. 😀
Wow cool!
Yes, it’s one of the cheapest and most amazing chips, but also not very well known, or so I feel.
I made a little webserver on it that polled a site I had, so that I could switch a load (ok, only an LED, but still) on and off both from the ESP and from the website. Quite a capable little chip.
Me, thinking about the days of dial up: 😭
Dzzzz rrrrr bidibidibippbip KRRRRRRRRRRRRRRR…
Hah. I was just playing a YT video of modem sounds for my son, after showing him some “history” videos about early PCs, BBS’s, text adventures, and early Commodore and PC gaming.
History? I lived it, son.
Normal, mainstream software expected users to run DOS commands and edit autoexec.bat/config.sys files, and installing new hardware often involved configuring motherboard DIP switches and trying to figure out what “IRQ” and “DMA” mean. There is no equivalent to that today. Plug it in, turn it on, and you’re done. 9 times out of 10 you don’t even need to install a driver; your OS already has it. Where does the door to learning and discovery present itself? With plug-and-play systems and walled-garden app stores, everywhere a user could possibly come across some more advanced concepts has been muted and decorated over with pretty conveniences. Computers are toasters now.
Eh, if you’re into computers, you’ll find your way. My first “programming” adventures were writing batch/vb scripts and putting them in the startup folder and watching the teacher lose their shit when their computer turned off after five seconds. Or watching all of the classroom open and close the CD drives 50 times when we were the first to have an IT class that day.
That part is about IBM PC architecture more than it is about computers in general, including personal computers of that time.
EDIT: I wonder why all the downvotes; this is just true - look at Macs of that time. I’m not saying interrupts themselves, or the concept of DMA itself, are IBM-specific.
I think it’s more a matter of the ideals of the times. FOSS was created in the 80’s; I see it as an ideological child of the 70’s, a period of time where progress, optimism and idealism about creating a better future and a better world probably peaked.
Of course there is also idealism today, but it’s different, at least the way I see it: the sense of quick progress, especially on the humanitarian side, is gone, the decades of peace with Russia are broken, climate change hangs as a threatening cloud above us, and the rise of China creates turbulence in the world order.
So although things maybe weren’t actually better in the 80’s, there are definitely aspects that look very attractive in hindsight.
But as I see it, the mentality for FOSS is now stronger than ever, because aside from idealism, it’s proved itself to also be a pragmatically good choice in many situations. But all the original founders and enablers are of course old today.
And complaining about how “people today” use technology is stupid, because chances are we would have done the same had it been available to us when we were young.
I disagree with your idea of real-world turbulence affecting it. Things were going the wrong way even in 2005. The dotcom bubble, the Iraq war - those things, maybe.
I actually think the USSR’s breakup is what, in the long term, caused our world to become worse.
Say, in terms of computers and mass culture too, the 90s are sometimes treated as a result of that breakup, but that doesn’t quite make sense: despite a few armed conflicts, it was a gradual process, and even in my childhood the CIS as an organization was treated almost as a new union in the making.
That breakup released a lot of dirty money into the world, and through not the cleanest people in Western countries, too.
And ideologically - the optimistic version of the Cold War’s ending was some syncretic combination of the “western” and the “eastern” promises of a united, space-faring future. And much of the 90s was about fantasies, often dystopian ones, set in the context of such a utopia.
IRL, both optimistic promises were forgotten. Thus the current reality.
Wow, you are way off time-wise; I spoke of the 70’s and 80’s. Everything you mention came AFTER that.
The FOSS idea dates to the early 80’s, and the FSF was created in the mid 80’s, and as I mentioned, it’s based on the ideology of the 70’s.
The turnaround came after Carter, when Reagan was elected - not just in the USA, but also in most of Europe.
I agree, but initially it was all cool, a lot of Europe achieved freedom and democracy, and the Soviet states turned to democracy. We even had cooperation between the West and Russia initially. Unfortunately Putin completely ruined that after he came to power in 1991, which is also around the time Linux started.
I meant the “peace with Russia” part by that, sorry.
Meant exactly that: (in my perception) there’s something that ideology shares with the science fiction of the same time, the cinema, electronic music, industrial design and general techno-optimism. Some kind of universalism, like in Asimov’s Foundation.
1999, 1991 is Yeltsin, but one is a logical continuation of the other (many Russian liberals disagree, love Yeltsin and hate Putin, don’t listen to them).
Perhaps; here I’m too ignorant.
You are absolutely right, my bad on that one. But actually under Yeltsin there was still room for optimism, and in those years cooperation between the west and Russia increased.
And you’ll be older tomorrow. There is no escape. Being a dick about it won’t help.
Given the person said they’re 28, I’m actually older. And I decided to not be a dick about it and to not pretend that everything was better when I was young. Everything was different, sure. Some things were better, some were not. But I decided to not do the whole “back in my days” thing because I always found it stupid and luckily that didn’t change with age.
Lol yeah that was some anecdotal evidence!
What’s next, girls vs boys code? People wearing hats vs people not wearing hats code?
Manager material right there.
BTW if an old geek argues that your code design/decision is bad then you should probably listen. But that’s what beginners don’t do, they think they know it all…
I think this is also a problem of old-timers not being able to articulate their concerns well. There is probably a reason they do or don’t do something a certain way, but if they can’t explain why, then no one is going to listen. Blindly following someone for perceived wisdom doesn’t teach you anything.
I actually like it when someone can show me why I’m wrong, because it saves me time. But if you can’t tell me WHY my idea won’t work, I’m probably just gunna do it anyway to figure it out myself.
I think this is as much a case of bad teachers as it is bad students.
Yes totally. I mean there are the same people just with an age/experience gap.
Glad you can read and repeat stuff! I presented it as such to avoid wannabe smartasses, guess they still arrived. Since we’ve touched on the subject of managers and hiring, do you often hear the phrase “not a cultural fit”? Wouldn’t surprise me.
If an old geek argues with a senior architect about architecture, I kinda think the architect is the one who’s right in 99% of cases.
That’s a looooot of assumptions and stale assumptions at that.