X11 Must DIE

Unix, C, computing like it's 1980.

Month: December 2013

old-school games and modern kids

mudding

Kids these days don’t know what they’re missing.

Being in the USA, and yesterday being December 25th, I’m reminded of all of the wonderful things that people are willing to spend money on for their kids.  I have kids, and while I’m all about them enjoying fun technology, I’m also not one to go out and buy them a new iPhone just so that they can play some cheesy Java games on it.

My desire to revive some of what I personally enjoyed when I was younger started when my oldest daughter finally decided that she wanted a “laptop” because, being 9 years old, she was obviously going to need one to file her taxes, create banking spreadsheets, and do research to help with her own software development (read with heavy sarcasm).  Anyhow, I did end up turning some of that “re-purposed hardware,” in the form of an eMachines laptop, over to her.  I handed her a Crunchbang DVD and a LinuxBBQ Coal CD, pulled the “How to install BBQ” wiki page up on my laptop, and then wished her the best of luck.  I figure if she’s old enough to have a laptop, she’s old enough to take responsibility for installing and administering her own system.  It took her most of the day, but she eventually installed both of them on different partitions, and then began to ask questions about how to turn off conky, change UI settings, etc.

As you’d expect, it wasn’t long before she was asking, “So, how do I get some games on here?”  We’d covered how remote repositories worked, so I suggested she pull the Debian bsdgames package and check it out.  Rather than the reaction I’d expected (“Urgh, Dad.  These games are all reading and they suck and I hate them and there’s no graphics and I won’t ever be popular in school with you as my father”), she actually got into some of them.  She started off hating them, and then realized that with a bit of imagination, they were still fun.

This was all about a year ago, and while she doesn’t really play games on her system anymore, opting instead to make stop-motion animation (something she’s really interested in), I can say that it helped me realize that we often underestimate what children are capable of doing and enjoying.  I ended up getting her the book “Snake Wrangling for Kids,” a Python programming book aimed at children, and she has managed to write a couple of useful scripts for herself.

I guess the whole experience involved me learning as much as she did.  Whereas she learned what grub does, I learned that if she really wanted to succeed at something, she was capable of it.  I learned that just because the technology world of today spends so much time talking about the hyper-realism it has achieved in game graphics, that doesn’t mean that older text-based games aren’t still fun.  It is easy to forget just how I felt the first time I got “Gro-bot” to run on my Tandy TRS-80 Color Computer, and that part of the enjoyment in those days was the amount of effort involved in making even the most basic of games.

I’m not sure that, without some amount of challenge, we can appreciate the joy of overcoming it.


We never know anything until we test it pt. 2

bashtest2


In the previous post, I’d looked at the difference in run speed between a case-based and an if/elif/else-based bash script.  Some of the comments (thanks to kooru and CorkyAgain for the input) led me to believe that we may have just scratched the surface of testing.

Any good science starts with a control group.  For this example I set up the for loop exactly as in both of the previous tests, but removed the comparison logic entirely, simply setting the variable to a constant and keeping the echo as before.  After a few runs, it came in around 0.319 seconds on average on the test system.  This number will henceforth be referred to as my “base value” for the scripts.  We also need a reasonable way to see what percentage of the script’s total run time the comparison itself takes.  For this, I’m thinking that we can make up an “efficiency factor” and call it something like EF, because I don’t think that there is a standard to maintain.  I am calculating the made-up EF as (“mean run-time” – “base value”) / “base value” * 100.  That should give us a pretty decent idea of how much time the comparison itself is adding, expressed as a percentage of the base value.
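To make that concrete with yesterday’s case run: assuming a mean run-time of about 0.353 seconds (in line with the roughly 0.35 second figure from the last post), the arithmetic works out to EF = (0.353 – 0.319) / 0.319 * 100 ≈ 10.66.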

Using this system, our EFs for yesterday’s test come out to 10.66 for the case statement and 83.39 for if/elif/else.  This is a much more meaningful pair of numbers, since it should (in theory) isolate the cost of the comparisons themselves rather than the loop overhead around them.

kooru suggested that switching from the normal “[” or “test” to the bash-specific “[[” would show a significant increase in evaluation speed.  It does.  Over 5 runs, the mean run-time came in close to 0.425 seconds for the double-bracketed compares.  This means that by changing only this one little detail, we dropped the EF of if/elif/else from 83.39 to 33.23.  That’s honestly a huge improvement, more than halving the overhead of what is almost identical logic.
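For anyone following along at home, the change really is just the test brackets; roughly this, with everything else left alone (the variable name here is mine, not necessarily what was in the test script):

    if [ "$val" = "3" ]; then    # classic test builtin
    if [[ $val == 3 ]]; then     # bash's [[ keyword; operands aren't word-split, and it benched faster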

I’m about to have to eat my words about how I’d expected case to behave, compared to if/elif/else, when the matches sit closer to the top of the list.  In both situations, it did make a difference.  I set up both a double-bracketed if/elif/else and a case to find the match in the first comparison every time.  This is what CorkyAgain had suggested for optimizing scripts where you know to expect certain situations more often than others.  I used the double-bracketed compare to give our “optimized” script the benefit of our previous findings.  Where there are 4 comparisons but the match is made at the first one, if/elif/else’s EF drops to 25.39.  In the exact same situation with case-switch, the EF goes down to 10.34 after figuring a mean.  So, this is where I hang my head in shame and admit that in case-switch, moving the match closer to the top makes an almost insignificant difference, while in if/elif/else it makes a much more noticeable one.
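In other words, the CorkyAgain optimization is just a reordering.  A sketch of the idea, with placeholder values rather than the ones from my test scripts:

    case "$val" in
        3) out="three" ;;    # the value we expect most often, checked first
        0) out="zero" ;;
        1) out="one" ;;
        *) out="other" ;;
    esac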

In summary, using our totally made-up, for-this-test-only EF values (smaller is better, for those of you who are just catching on):

case-switch with unknown priority: EF=10.66
case-switch with stacked priority: EF=10.34
if/elif/else with unknown priority and "[": EF=83.39
if/elif/else with unknown priority and "[[": EF=33.23
if/elif/else with stacked priority and "[[": EF=25.39

I believe that we can start making generalizations about optimizing scripts now.  In every situation we tested, case-switch outperformed spaghetti-styled if/elif/else.  If you’re writing scripts where only a few comparisons are being done, the difference is probably negligible.  Assuming that it’s a bash script, double-bracketing should be used over single-bracketing wherever possible.  And in all situations, if you know that certain comparisons are more probable than others, those choices should be moved to the top of the stack, as long as it doesn’t interfere with the functionality of the script itself.

We never know anything until we test it

bashtest

Feel free to look carefully at this one.

There was recently a short discussion regarding a bash script that had been written, and how it could be improved.  One of the big suggestions was replacing spaghetti-styled “if, elif, else” statements with “case” switches.  While I totally agree with this, I also realized that although I could write a good deal on how a compiler can optimize a case-switch, I wasn’t 100% sure what the difference would be in an interpreted language.  Since this was about bash, it only seemed right to set up some kind of test in bash.

So, I created two scripts that are essentially identical, save that one uses the “if/elif/else” syntax and the other the “case” syntax inside of a for loop.  The for loop should iterate enough times that we can see a difference…so let’s go with 10,000 iterations.  Since I want both of them to have the same functionality, we’ll set up a second variable that is set manually in the exact same way in both (by direct assignment), but is value-checked using the two differing methods.

It stands to reason that, since case-switch breaks out as soon as a pattern matches, we’d see a significant difference from “if/elif/else” reading through every single check.  I considered testing whether hitting only the “else” or “*)” default at the end of each iteration would make a difference, but that’s not a practical use.  To be able to generalize, it makes sense to have different levels within the conditional construct match at different iterations.
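Since the screenshot doesn’t show the scripts themselves, here’s a minimal sketch of the kind of pair I’m talking about; the variable names, the modulo trick for spreading matches across the levels, and the echoed strings are a reconstruction, not the exact originals:

    #!/bin/bash
    # case version: assign a value each iteration, then check it
    for ((i = 0; i < 10000; i++)); do
        val=$((i % 4))
        case "$val" in
            0) out="zero" ;;
            1) out="one" ;;
            2) out="two" ;;
            *) out="other" ;;
        esac
        echo "$out"
    done

    #!/bin/bash
    # if/elif/else version: identical loop and assignment, different construct
    for ((i = 0; i < 10000; i++)); do
        val=$((i % 4))
        if [ "$val" = "0" ]; then out="zero"
        elif [ "$val" = "1" ]; then out="one"
        elif [ "$val" = "2" ]; then out="two"
        else out="other"
        fi
        echo "$out"
    done

Each one gets timed the same way as described below, with something like “time ./casetest.sh” and “time ./iftest.sh”.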

It turns out that under MY test (other systems, and certainly other scripts, could produce different results) the case-switch script ran almost twice as fast as the if/elif/else one.  I tested it quite a few times using “time /path/to/script” and made certain to screenshot at least a few of the results.  The case script ran consistently in the 0.35 second range, while the “if/elif/else” script ran in the 0.56 to 0.58 second range.  It wasn’t quite 2x the speed, but it was certainly close enough to warrant consideration when writing your own scripts.

I am always wary of posting anything that I do in bash, as I’m not a bash-scripting artist.  If you believe that I could have made these tests better, should have done them differently, or have any additional input…feel free to share it in the comments.  I’m always up for seeing testable results, so if there’s some way I could have made the test more sensitive to the individual differences, please let me know.

DebianJoe’s Book of the Month Club

cover


If Oprah can have a book of the month club…

There are very few books about computer science written as long ago as the mid-1980s that still carry the relevance that SICP does.  While thinking about writing this post, I thought I’d check out the general consensus on what I’d consider essential reading for almost all computer programmers, and I actually laughed at the collection of reviews.  People either love it or hate it, and I think that fact deserves some discussion.

I know enough lisp to get myself into REPL trouble.  Even then, I tend to use elisp or common-lisp far more often than scheme, so why would I suggest a book written around a lisp dialect that I don’t regularly use?  To put it as simply as possible, SICP is far more about the way to think than about the specific language used in the examples.  It’s like reading jazz guitar theory, and as such it carries over to any language that you may attempt to learn in the future.  Also, in much the same way, every time I pick it up and look through it again, I pick up something new.

It’s not light reading designed for a coffee break, and I tend to break out gnu-guile in emacs while reading it, just to play with some of the examples.  If you’re just starting out writing some “Hello World” programs, it may be a bit on the heavy side to follow, although you could certainly supplement it with extensive research every few paragraphs to help solidify the ideas.  If you don’t totally burn out, you’ll certainly learn a great deal.

Essentially, the concepts being presented are about using simple procedures to build complex ones in a systematic way.  If you begin to really think about how all computer programming is the complex manipulation of a whole lot of tiny relays, then I’m sure the value of such a concept becomes apparent.  There are in-depth sections on ways to deconstruct complex programs into smaller, more digestible fragments.

Also, MIT is great enough to offer the book in HTML for free, should you be so inclined, at http://mitpress.mit.edu/sicp/full-text/book/book.html.  There are a great many resources out there for the hobbyist programmer that are probably far easier to digest, but for the dedicated enthusiast or serious programmer, this one is certainly worth the effort to check out.

Repurposing old hardware.

2013-12-12-051206_1280x960_scrot


Fun little project.

After the huge discussion about how I’m a thinkpad fanatic, one of the guys I work with brought me a t43 in a cardboard box!  He said, “I think something may be wrong with the mobo, but you can have it for spare parts if you need it.”

The first thing I did was slap a HDD in it and try to boot it up.  It posted with some beep codes, and then I realized that I could faintly see the outline of some text (a classic dead-backlight look).  It was complaining about the hard disk, but recognized it, let me bypass the firmware update, and continued to boot to grub.  This HDD already had Debian Sid with cwm on it, and I figured I’d use it for the testing.  I plugged an external monitor into the vga port and was pleasantly surprised to see getty awaiting a login.

Okay, no big deal.  I’ve got an extra backlight.  So I swapped the panel out entirely, but was presented with the exact same issue.  That ruled out the backlight, power inverter, etc.  Then I swapped out the power supply for the monitor.  Nothing changed.  At this point, I tried the external monitor again.  Still worked fine.  So, lacking the know-how to remove the on-board graphics adapter, I started thinking, “What can I do with a working laptop base hooked up to an external monitor?”  I already have a few desktops.  I have more laptops than is probably normal.  And then it hit me.  My kids would probably really like a SNES, NES, and Game Boy system that they didn’t have to blow on cartridges from 1988 to make work.

I set up the system, modified a SNES controller with an internal chip to convert it to a USB joystick, and then set up the little t43 base with mednafen and some ROMs of games that I owned.  After setting up the controller, I played Super Metroid for about 30 minutes before I remembered that I was supposed to be working on a fun project…not playing Super Metroid.  The problem is that mednafen, while an amazing emulator for multiple systems, is launched from the command line with the path to the ROM you want to open.  This is a bit complicated for a 5-year-old.

So, I hacked out a little bash script that simply lets them pick a game by number, which the script then launches via case-switch.  I tested it out thoroughly with Mike Tyson’s Punch Out, and then turned it over to them to play with.  I modded up the .cwmrc to actually contain menu items so that they could easily launch my script, and let them go nuts.
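The launcher itself was only a few lines; a sketch of the idea, with the game list and ROM paths here as placeholders rather than my actual collection:

    #!/bin/bash
    # kid-proof launcher: print a numbered menu, start the pick via case-switch
    echo "1) Super Metroid"
    echo "2) Mike Tyson's Punch Out"
    echo "3) Tetris"
    read -p "Pick a game: " choice
    case "$choice" in
        1) mednafen /home/kids/roms/snes/super_metroid.smc ;;
        2) mednafen /home/kids/roms/nes/punch_out.nes ;;
        3) mednafen /home/kids/roms/gb/tetris.gb ;;
        *) echo "That one's not on the list." ;;
    esac

The .cwmrc side is just a “command” menu entry pointing at the script, something along the lines of: command games /home/kids/bin/games.sh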

After all was said and done, I ended up with a pretty cool little system that was a good replacement for some really old hardware that everyone still loves.  Considering that I had all of the parts sitting around except for the base that was given to me, it cost me exactly “nothing” to build other than some time.  Everyone loves it, especially for getting a few of the older Game Boy games off of the very sketchy, ancient Game Boy that I let them play with and onto something that actually worked well.

I love repurposing old hardware in ways that let it be appreciated once again.  Anytime that ends with me getting to play Super Metroid, that’s just a double win.

Hardware isn’t as hard as it used to be…

ibm-keyboard

This is a short rant, but it’s one of those things fresh on my mind.  Today, a good friend of mine, who I’ll call “openSUSE guy,” had some issues with his laptop.  He has been using an Asus laptop with a core i7, 16GB RAM, and ATI video.  It probably goes without saying that he sometimes pokes fun at my choice of hardware, as I often have HDDs with specs similar to his RAM.

Anyhow, this morning openSUSE-guy’s computer wouldn’t boot, as in “won’t post, fan spins up and stops, and nothing flashes on the monitor” wouldn’t boot.  I asked if he’d given it a cup of coffee, because I dumped a whole cup on the last laptop I had to replace, and they don’t like coffee as much as you might hope they would.  No such fatal mistake; it had shut down properly yesterday, and today…nothing.

While hardware failures do occur, and there’s really nothing that can be done in some cases, this isn’t about the massive failure.  openSUSE-guy took the keyboard off to check some voltages across the motherboard, and when he did…the keyboard started flopping around like one of those ribbons that the little Romanian gymnasts use in an Olympic sport I don’t understand.  It was extremely flexible, with a molded plastic overlay to separate the Chiclet keys from each other.

I couldn’t help myself.  “That’s what’s wrong with it.  It’s made to be thrown away.”  I went to a filing cabinet, pulled out an old t43 keyboard (a filing cabinet is where everyone keeps spare laptop parts…right?), and tapped it against a coworker’s desk.  “Back in the IBM days, laptop keyboards had a solid metal backing, which is way more masculine than that neon-backlit piece of junk.”

Really, the new laptops aren’t made to be worked on.  You’re supposed to use them up and then throw them away, and this just feels wrong to me.  I’d much rather buy a few pieces to replace defective parts for about ten bucks on my older laptops than drop $1000+ USD on a new laptop.  I’m cheap like that.  Once upon a time, things were intended to be used and reused, and it seems to me that products are being made at lower quality than their predecessors.  This goes for laptops, cell phones, and even vehicles.  Perhaps it’s a sign that I’m getting old, but I keep thinking, “In my day, things were made better.”

After this rant, though, another guy I work with pulled an old Thinkpad 755CD out of hiding…and I realized THAT is a real laptop.  You could beat someone to death with those things, and they’d probably still boot up.

When Life Reflects Art

killxsolar

Due to horrible icy weather conditions these past few days, I’ve been either at work…or stuck in the house.  While this would be a great time to have been working on my installer (anyone else seen an entire HDD corrupted due to a dirty bit on hard shut-down from btrfs?), I’ve been trying to convince my wife that our home needs less “stuff.”

This all started a few days ago when one of the guys I work with was joking about a girl from another department bringing her laptop over for him to work on.  From what I gathered from his rant, she was using MS-Windows and had added a few “bonus” toolbars to her web-browser.  He was laughing about how much space they were taking up, and how reading anything was like peering through a slit.  I responded with my own rant about how most developers are trying to steal my vertical screen space without any additional toolbars, which, given that laptop monitors keep getting shorter and wider, makes no sense.

Of course, he then responded by asking what I use.  So, I pointed out that since his main work system was openSUSE, he could replace the monstrosity that is KDE with something like cwm or dwm, and use BBQRoaster (I am biased) or suckless’s “Surf” browser to get that screen real-estate back while browsing.  This led to some jokes about me being OCD, which I denied by pointing out that it isn’t obsessive to be frugal with useless decorations.

This is a long way of getting around to the discussion I later had with my wife about Newsbeuter being my preferred method of getting RSS feeds, and about how I really like simple and direct methods for completing whatever the task may be.  As we started talking about how fewer distractions make the world easier to navigate, it somehow ended with me totally rearranging my house and starting to dispose of things that don’t get regular use.  I’m totally okay with this, as I really have only a very limited amount of input as to what furniture we have, where it goes, etc.  The house is her domain, and my big, blue chair and desk are really the only things that I insist she leave to me.  She sold some of the things that we weren’t using, and is giving away much of the rest.

One of the things that I have noticed is that the house looks much larger with fewer things taking up the corners and being in the way.  Cleaning is easier.  I don’t even have to open my eyes before my first cup of coffee to navigate from the bedroom to the kitchen.  I love it so far.

This is how I like everything to work.  If I don’t need it, then generally, I don’t want it.  I’m not a good consumer, because I tend to be happy if something does what I want it to do.  I don’t need the “newest” as long as what I have is capable of doing what I desire, and this often leads me to products that I’m madly in love with because of the quality with which they do the job.  I still have yet to find a laptop that I prefer to my old IBM Thinkpad.

So, for a quick summary of how I compute, here’s the layout for the systems that I regularly use.  On LinuxBBQ RSI, Arch, and killx…I use emacs, w3m, vi, tmux, and primarily emacs plug-ins and bash/zsh scripts for everything that I do on them.  On RSI, I use fbterm, because pretty pictures and text are enjoyable.  On vanilla Sid, I use cwm coupled with a dynamic tiler that I hacked together, because I can switch back and forth between stacking and tiling, and do both well.  I’m still running dwm on BBQ-Elektra, but I don’t use it very often anymore.  These setups range greatly in how they look, but the basic premise is always the same: they do what I want from them without doing much else.

Simplifying the environment that you exist in can range from not having to deal with gtk3 to not stubbing your toes on a decorative end table.  Either way, it makes life better.  Simplify.

sc: the vi of spreadsheets

sc


Today, I had a really simple problem that may have turned into a love affair with a very straightforward program.  One of my hobbies is weightlifting, and tomorrow I am starting a program to bring up some of my “lagging” lifts.  The plan requires me to work with percentages of my personal-best lifts, so I needed a simple way to keep up with the percentages and update them all as one number changes.  This seemed like a good job for a spreadsheet.

While I’d normally just use emacs ses-mode, I thought I’d give sc a shot.  I already work with the ncurses development libs, so I only had to spare about 300k worth of HDD space to play with it.  In my normal fashion, I simply made a file and started trying to edit it without reading anything at all.  After failing to enter a single line of text about 5 times in a row, I decided to actually read the man pages.

The keybinds are very vi-influenced, so after reading a few of the commands, I once again started to play with it, trying vi sequences to edit fields.  This worked about 90% of the time, so it wasn’t long before I had exactly what I was looking for.  This project was never intended to be complicated, so I didn’t need to dig into everything sc is capable of to fill my needs.  It appears to be significantly more powerful than I needed.
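For the curious, the sheet really is that simple.  A sketch of what the saved file looks like, if I’m remembering sc’s save format correctly (the lift names, numbers, and percentage are placeholders, not my actual maxes):

    leftstring A0 = "squat"
    let B0 = 300
    let C0 = B0*0.75
    leftstring A1 = "bench"
    let B1 = 200
    let C1 = B1*0.75

Change a personal best in column B and the working percentage in column C follows it.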

While I’m very comfortable with elisp equations, I can see how the syntax might be daunting to someone who hasn’t adjusted to it.  For that very reason, I think I’d recommend sc over ses-mode for simple spreadsheet work.  I have no intention of sharing my sheet with coworkers, or doing highly secretive banking with it, and so for this particular purpose I can say that I was very satisfied with the final product.

I’m sure that for people who actually use spreadsheets for work, my opinion on the matter is probably not that useful.  I’d need a “spreadsheet specialist” to tell me more about things that are missing, or buggy, or whatever.  That being said, for the majority of what I’d actually use a spreadsheet for, sc does an amazing job.  If you’re familiar with vi, the learning curve isn’t too steep.  On top of that, it’s a very simple program that is very easy on system resources.

I’m really glad that I gave this one a fair shot.