"I recall, some years ago, designing a controller..."
Yeah, me too!
In the early 1970s, I was tasked with designing a controller for a raster-scanned X-Y positioning stage. This was part of a new laser window-test facility that we had been asked to spin up for the Air Force. I didn't know it at the time, but this was the beginning of our involvement in the so-called "Star Wars" Ballistic Missile Defense Initiative, or BMDI. That program ran for years (it is still ongoing) but yielded no reliable, battle-ready hardware until decades later, long after my involvement ended.
Although discrete-logic integrated circuits were available, especially TTL (transistor-transistor logic) ICs, I decided to implement the controller using diode-relay logic because we had only a short window of opportunity. This turned out to be a very robust solution, quickly implemented with commercial off-the-shelf components, and it worked perfectly the first time it was turned on. It has probably long since been scrapped, because we later built another one, bigger and better. If asked to do it again today, I would surely use a microcontroller instead of diode-relay logic; the state of the art in both processors and programming for embedded systems has advanced a LOT in fifty years. I still prefer assembly language programming over high-level, abstract coding methods because it keeps me as close as possible to the actual hardware, banging on the bits so to speak.
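For anyone who hasn't tried it, here is a rough sketch of what "banging on the bits" looks like on a modern AVR micro. This is C compiled with avr-gcc rather than true assembly, but each line maps almost one-for-one onto machine instructions. The clock speed and the choice of pin PB0 are purely illustrative assumptions on my part:

```cpp
/* Minimal bit-banging sketch for an AVR micro (avr-gcc).
   F_CPU and the choice of PB0 are illustrative assumptions. */
#define F_CPU 1000000UL      /* assume a 1 MHz clock for the delay macros */
#include <avr/io.h>
#include <util/delay.h>

int main(void)
{
    DDRB |= _BV(DDB0);           /* set one bit: PB0 becomes an output */
    for (;;) {
        PORTB ^= _BV(PORTB0);    /* flip one bit: the pin toggles */
        _delay_ms(500);          /* crude half-second busy-wait */
    }
}
```

No operating system, no abstraction layers: you write a bit into a register, and a real pin on a real package changes state.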
@danadak has opened my eyes to graphical programming, which is another step closer to simply telling the computer what to do. Still, there is a long way to go, because the graphical paradigm generates high-level C or C++ code that must then be compiled into executable machine code... the ones and zeroes that machine logic "understands."
This being the early part of the 21st century, I can understand the reluctance of earlier generations (Boomers, Gen X, Millennials, etc.) to embrace unfamiliar paradigms. But it really is better to learn the new while still appreciating and cherishing the old... IMHO. Those who forget the mistakes of the past are doomed to repeat them in the present.
Word got around quickly. A few weeks later a team of high-level Air Force personnel visited our test facility and were suitably impressed. We were then awarded a contract to build a full-scale laboratory, to be used to test laser windows of weapons-grade size... at the Air Force Weapons Laboratory in Albuquerque, New Mexico. I didn't participate much in that effort initially, but instead worked to establish a smaller version of the laser window test facility on campus. Those were indeed fun times.
Meanwhile, technology sped ahead and microprocessors were invented. I got involved with using the Intel 8080 µP as an embedded controller for scientific instruments, later "moving up" to the Intel 8085, which was easier to use. The last time I implemented an 8085 system was in the 1980s, when I was tasked with upgrading a tourist exhibit of the Apollo mission control room for NASA. During that same period, microprocessors became embedded in personal computers, but acceptance was slow in the early 1980s. That all changed with the introduction of software-based (instead of paper) spreadsheets on PCs. IBM and Microsoft were alleged to have a motto for their operating-system software development: it ain't done 'til Lotus won't run! But open-source "free" software eventually put a damper on that.
So, fast forward to June 2012. I turned 68 that month and was by then only working part-time. Somehow I found Electronics Point and met @KrisBlueNZ (deceased), who was trying to help @TenderTendon use a MOSFET switch, controlled by a PIC microcontroller, to operate a custom-machined, high-brightness, five-ampere LED flashlight. You can look up this thread to see how that worked out. Jeff hasn't abandoned that project, but he is taking it in another direction that I will help him with.
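The general idea there is simple enough to sketch. Assuming an Arduino-class micro and a logic-level N-channel MOSFET switching the LED's five-ampere supply (pin 9 and the duty-cycle values below are my illustrative assumptions, not the actual circuit from that thread), the micro only has to wiggle the gate:

```cpp
// Hedged sketch: PWM-dimming a high-current LED through a logic-level
// N-channel MOSFET. The pin and duty-cycle choices are assumptions.
const int GATE_PIN = 9;        // PWM-capable pin wired to the MOSFET gate

void setup() {
  pinMode(GATE_PIN, OUTPUT);
}

void loop() {
  analogWrite(GATE_PIN, 128);  // ~50% duty cycle: roughly half brightness
  delay(1000);
  analogWrite(GATE_PIN, 255);  // fully on: MOSFET saturated, full drive
  delay(1000);
}
```

At five amperes, the choice of MOSFET (low on-resistance, fully enhanced at logic-level gate voltage) matters far more than the code does; that hardware end was where the real work in that thread happened.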
So, I stand by my statement: "integrated circuit processors are not sledgehammers, and they are really 'dirt cheap' solutions." Yes, you do need certain inexpensive tools (anyone still use a multimeter?) and there is a learning curve, but once you have dipped your toes into those new waters, you will find the swimming is a lot of fun. The solutions that @danadak is offering are almost free, with a few bux needed to buy an Arduino Nano that can be used for programming other micros (such as the ATtiny) using free software that can be downloaded from the Internet. You do need a personal computer, but most of us here have at least one of those, even if it's in the form of a "smart" cell phone.
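To give a feel for how little is involved: the Arduino IDE ships with an "ArduinoISP" example sketch that turns the Nano into a programmer, and with a third-party ATtiny core installed in the IDE, a complete target program can be as small as the blink sketch below. I'm assuming an ATtiny85 with the LED on pin 0 (PB0, physical pin 5); both are illustrative choices on my part:

```cpp
// Minimal blink for an ATtiny85, uploaded through an Arduino Nano
// running the stock "ArduinoISP" example as the programmer.
// Assumes a third-party ATtiny core in the Arduino IDE; the LED
// pin (0 = PB0, physical pin 5) is an illustrative choice.
void setup() {
  pinMode(0, OUTPUT);
}

void loop() {
  digitalWrite(0, HIGH);   // LED on
  delay(500);
  digitalWrite(0, LOW);    // LED off
  delay(500);
}
```

Six wires between the Nano and the ATtiny, one click of "Upload Using Programmer," and you have a working embedded system for pocket change.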