
Author Archives: Cats & Code

I encountered a rather obscure IE bug at work this week. After coding up a series of text inputs that enable/disable other radio buttons based on their values, I was dismayed to find out my javascript code wasn’t working in a test environment. I opened Internet Explorer’s F12 Developer Tools to check for any errors. Unfortunately, the javascript console was useless; nothing was there! No errors, no warnings, nothing. Maybe the console hadn’t recorded an error because it wasn’t open at the time or something, I thought. So I entered another value and found that the javascript suddenly started working! The radio buttons were disabled appropriately and the console was displaying my console.log statements. I was confused. I tried several other scenarios and variations of input, trying to pin down what exactly was causing my code to become active. No luck. Fortunately, a colleague of mine had experienced this exact problem before and let me know how to fix it.

Simply remove the console.log lines.

Did you know that console.log does not work in IE unless you have the F12 Dev Tools open?! I sure as hell didn’t. And not only does it not work, it breaks other javascript on the page too.

Here is a stackoverflow link that explains the issue in more detail.

So I learned my lesson here: be certain that all javascript debug statements are removed from my code before delivering it.

Inside WordPress’s WYSIWYG editor, if you start a line of text inside a code tag with a bad character (in my case, the ‘#’ character), the output gets screwed up. Here’s how to fix it.

If you looked at my previous post in the first week or two after I posted it, you may have noticed how ugly the post was when it came to displaying my console’s output. This is because I put the console’s output inside a <code> tag in the WordPress WYSIWYG editor. My intent was to differentiate my blog post text from the output of the console I was using as an example. However, after wrapping the console output in the tag, this is what it looked like:

Code Tag with Incorrect Output

I have successfully used the <code> tag in WordPress before to stylize blocks of text, so that’s what I was using here to represent the console output. I didn’t have any trouble in any previous posts, so I wasn’t sure what was different this time. After experimenting a bit, I found that the “#” character that begins each line was the culprit.

code tag with a bad character at the start of each line

The fix was simple: I added a space before each line.

Code tag with a bad character, but prepended with spaces

This small change gave the output the desired effect.

Code output with correct formatting

My first open source contribution was just accepted! (Technically, it’s my 2nd contribution, but my 1st accepted one!) It was a very minor change, just an update to the user’s profile page on GitLab. Nevertheless, it’s a great feeling, knowing that I can directly improve a great product that I actually use. And I gained a bit of git experience because of this as well.

simple git history

So pretty!

Before I get to what I broke and how I fixed it, I feel the need to defend myself a little first. I like seeing the pretty, flowy lines that are used to track the commits of a project in git. However, the entirety of my git experience has been as a solo developer. As such, I have noticed that every time I merged my feature branch into master, there was no fork reflected in the history (using git log --oneline --decorate --graph). Just a boring-ass straight line. So, purely for aesthetics and not because it was at all efficient or appropriate, I found a way to get this fork & branch history to show in my logs: I performed my merges with no fast-forwards (git merge --no-ff). This creates an additional commit that mentions the two branches that were merged together, but it achieved the effect I was looking for.
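For anyone curious, that purely cosmetic workflow looked roughly like this (the branch name here is just a placeholder):

git checkout -b my-feature
# make some commits on my-feature
git checkout master
git merge --no-ff my-feature
git log --oneline --decorate --graph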

So fast-forward (HA!) to today. I followed my usual, ignorant workflow for my submission to the GitLab project. I forked their repo, cloned locally, branched, made changes, merged with no-ff back to master, and pushed back to origin. From there, I made a merge request for my change to the official GitLab Community Edition project. An admin wrote back that I needed to squash my two commits into one. Blerg, I should have known that this practice would eventually bite me! I spent the next few hours unsuccessfully trying to figure this out on my own. After responding to the admin the next day that I was still working on it, he wrote back with a very helpful link.
So, to squash the pointless commit that --no-ff generated, I had to rebase back to the commit before my first commit. I did this in interactive mode (git rebase -i <hash of commit>). This opens a text editor with the history of commits between the latest commit and the one that was selected for rebase. In my case, I saw my commit and the autogenerated merge commit. From that text editor, I can then select what happens to each commit.

# Rebase 408a140..5d5af41 onto 408a140
#
# Commands:
# p, pick = use commit
# r, reword = use commit, but edit the commit message
# e, edit = use commit, but stop for amending
# s, squash = use commit, but meld into previous commit
# f, fixup = like "squash", but discard this commit's log message
# x, exec = run command (the rest of the line) using shell
#
# These lines can be re-ordered; they are executed from top to bottom.
#
# If you remove a line here THAT COMMIT WILL BE LOST.
#
# However, if you remove everything, the rebase will be aborted.
#
# Note that empty commits are commented out

My case was simple. I chose fixup for the superfluous commit and pick for my commit. After saving these changes, the rebase occurred. Everything is fine now, right?
Nope. My dumbass had already pushed the change to the remote repo, so I had to overwrite that remote master branch with my new, rebased master branch. A normal git push wouldn’t do it because my local and remote repos were out of sync (some kind of fast-forward error). So I just forced my changes to overwrite my remote master branch.

git push --force

Ha ha ha Force Push!

git push --force origin master

This force push replaced my remote master branch with the revised one and let me successfully merge my change into the GitLab source! Awesome! Proof positive that you don’t have to be some sort of programming genius to be able to contribute. I don’t even know Ruby!

I discovered Vagrant earlier this week and was immediately smitten. As I bounce around from my PC to my laptop to my work laptop, I have a lot of dev environments to manage. Vagrant manages all of that business for me now!
The TL;DR of this post is that you need to be sure you are running the most recent version of VirtualBox (4.3.16 as of this post). I was running version 4.3.14 and still had trouble.

I have a Dell Latitude with the usual integrated Intel graphics w/ additional Nvidia or AMD graphics card configuration. In my case, it’s an Nvidia card. I have it docked in a replicator and outputting to 2 monitors and the laptop screen.
Vagrant is supposed to make managing VMs super-easy, so I was a bit confused as to why I couldn’t even boot a vanilla ubuntu/trusty64 VM. From a DOS prompt, my “vagrant up” would yield:

...everything looks good...
==> default: Booting VM...
==> default: Waiting for machine to boot. This may take a few minutes...
The guest machine entered an invalid state while waiting for it
to boot. Valid states are 'starting, running'. The machine is in the
'poweroff' state. Please verify everything is configured
properly and try again.

If the provider you're using has a GUI that comes with it,
it is often helpful to open that and watch the machine, since the
GUI often has more helpful error messages than Vagrant can retrieve.
For example, if you're using VirtualBox, run vagrant up while the
VirtualBox GUI is open.
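
(For context, reproducing this takes nothing more than a stock box and the two standard commands, with no Vagrantfile customization involved.)

vagrant init ubuntu/trusty64
vagrant up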

The error says to try it again with the VirtualBox GUI open, so I do that. But as soon as I open VirtualBox, I get a popup with another error.
VirtualBox.exe - Bad Image
C:\Program Files\NVIDIA Corporation\CoProcManager\detoured.dll is either not designed to run on Windows or it contains an error. Try installing the program again using the original installation media or contact your system administrator or the software vendor for support.

With some quick DuckDuckGo-fu, I find a VirtualBox ticket detailing exactly my issue, specifically comment #3. The last comment has the solution: just update to the newest version of VirtualBox. Just be forewarned, if you are lazy like me, you will try the “Check for Updates” feature inside VirtualBox and find that you are supposedly running the most recent version. But this feature doesn’t take into account tiny sub-dot releases. You will have to go directly to the VirtualBox website to download the latest.

After version 4.3.16 was installed, both issues disappeared: the Nvidia dll error upon opening VirtualBox, as well as the command-line issue that was preventing Vagrant from starting my VM.

I’ve been working with a colleague on getting a logo designed for the game company he has recently started. The designer that was hired used a font called “Exo” in the logo. The source was given to us as a .svg file. And in order to view the logo with the correct font, I first had to install the Exo fontface on my Ubuntu installation.
Here’s how I did it.
First, find the fontface for download somewhere. We ended up finding it on fontsquirrel.com and downloading it in .zip format. Next you will need to create a .fonts folder under your /home/<user-name>/ folder. Inside this .fonts folder, create an ‘exo’ folder to store the actual OTF files. Extract the files from the exo.zip file into /home/<user-name>/.fonts/exo. Finally, run the following command:
sudo fc-cache -f -v

You should see all the font caches on your machine getting refreshed, including the new .fonts folder.
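
Putting it all together, the whole process looks roughly like this (assuming the zip landed in ~/Downloads as exo.zip; adjust the paths to match where yours ended up):

mkdir -p ~/.fonts/exo
unzip ~/Downloads/exo.zip -d ~/.fonts/exo
sudo fc-cache -f -v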
After this, I opened our logo.svg file in Inkscape and the correct font was applied.

Easy as that!

So I’m investigating an Eclipse error I’m getting on my dev laptop and need to search for a bit of text that may be in a hidden file somewhere in my workspace. I don’t trust Eclipse’s global File search (Ctrl + H) to be thorough enough for my needs. I need to do this via the OS.
The grep command is the solution.

It is used like so:


sudo grep "<text to find>" <file(s)>

For example:

sudo grep "IOException" *.log

Now, my particular issue meant that I didn’t know which folder the file would be located in. To get around that, I used the recursive argument or ‘-r’:

sudo grep -r "IOException" *.log

This loops through all the files in the current directory, as well as sub-directories of the current directory.
At least that’s how I understand it to work. Feel free to correct me if that’s not the case.
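
For what it’s worth, if the goal is to hit every file under the current directory, hidden files included, I think the simpler form is to point -r at a directory instead of a glob:

grep -r "IOException" .

That searches everything beneath the current directory, no matter what the file names look like.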

I am not sure how long Github has offered their Pages service, but I only recently discovered it. And, as I have been trying to improve my workflow, I found the idea of deploying my site with a simple “git push” very enticing.
I experimented briefly with Pages and decided to go with Github for my webpage hosting. However, I quickly realized I had a problem. I had an existing Github repo that housed the code for bencarson.net. I also had a Github repo named ben-carson.github.io that I had created for my Pages experimentation. I didn’t want to lose the history of my original website project. And I also didn’t really want to just copy-and-paste my original site’s files over my Pages repo; that just felt too inelegant. What I really wanted was to combine these two distinct git projects and their histories into one project.
My Git-fu is still pretty weak, so I took to the web for help. My initial searches provided results on merging subtrees and modules and whatnot. More along the lines of keeping a library that a project uses up-to-date, rather than a one-time project meld. Way overkill for my needs. Then I found this post. It was close to exactly what I was looking for.
For the sake of clarity, I’ll include just the (DOS) commands I ran for this process:

C:\> mkdir bencarson-website

C:\> cd bencarson-website

C:\bencarson-website> git init
Initialized empty Git repository in c:/dev/workspace/blog-post/.git/

C:\bencarson-website> dir > deleteme.txt

C:\bencarson-website> git add .

C:\bencarson-website> git commit -m "Initial commit"

C:\bencarson-website> git remote add bc-gh-io-remote https://ben-carson@github.com/ben-carson/ben-carson.github.io.git

C:\bencarson-website> git fetch bc-gh-io-remote
warning: no common commits
remote: Counting objects: 73, done.
remote: Compressing objects: 100% (55/55), done.
remote: Total 73 (delta 11), reused 69 (delta 10)
Unpacking objects: 100% (73/73), done.
From https://github.com/ben-carson/ben-carson.github.io
* [new branch] master -> bc-gh-io-remote/master

C:\bencarson-website> git merge bc-gh-io-remote/master

C:\bencarson-website> git rm deleteme.txt
rm 'deleteme.txt'

C:\bencarson-website> git commit -m "removing garbage file"
[master 05b4839] removing garbage file
1 file changed, 10 deletions(-)
delete mode 100644 deleteme.txt

#combine the 'remote add' and 'fetch' commands with the '-f' parameter
C:\bencarson-website> git remote add -f bc-net-remote https://ben-carson@github.com/ben-carson/bencarson.net.git
Updating bc-net-remote
From https://github.com/ben-carson/bencarson.net
* [new branch] master -> bc-net-remote/master

C:\bencarson-website> git merge bc-net-remote/master

C:\bencarson-website> git remote remove bc-net-remote

C:\bencarson-website> git remote rename bc-gh-io-remote origin

C:\bencarson-website> git push origin master

After this, my new, history-merged site was pushed up to ben-carson.github.io and is available [Edit 03.10.15: removed all my github stuff]! Super easy, once you know how to do it.

image

Any idea what it is?

I bought a Raspberry Pi for myself this fall to play around with the minicomputer that is so popular these days, and as an opportunity to teach myself a little more about Linux, hardware, and networking. I read many, many posts from individuals running the Pi as a media server for their first experience with this computer. Seemed like a reasonable place for a Pi noob like myself to start as well…

If this previous paragraph sounds anything like you, consider this post a warning.

TL;DR version: DON’T! Just because you can do something, doesn’t mean you should.

  1. Install RaspBMC to SD card
  2. Boot Pi
  3. Wait for OS to initialize
  4. Realize process is frozen
  5. Reboot Pi
  6. See that installation is corrupted because previous attempt to start failed
  7. Go back to step 1
  8. Continue previous steps until the XBMC interface actually installs all updates without freezing or corrupting itself
  9. Try to do something once interface is loaded
  10. Oops, you moved the mouse too quickly or used the wifi connection and pulled too much power; system is hung
  11. Reboot
  12. Continue this process until you can get to the “Plugins store”
  13. Browse available plugins
  14. Peruse the vast wasteland of useless XBMC plugins
  15. Realize that nothing worth loading runs on an ARM processor (Want Netflix/Hulu/CW/ABC/etc? Too bad!)
  16. Say fuck it all and install something else on your SD card. This thing wasn’t meant to run XBMC

If you want a media player, just buy a commercial product like a Roku 3 or Apple TV. By the time you’ve invested enough time, money, and energy in just getting what the Pi needs to run (case, 5V/1A+ PS, USB cable, HDMI cable, powered USB hub, SD card, mouse, keyboard, wifi dongle, etc), you will have spent MORE for an inferior product!

But wait a minute, this was supposed to be a learning exercise for you, right? Didn’t you at least learn something about the hardware or Linux? Well, maybe. I did learn how to get my TV tuner working on my laptop, in an attempt to get it working on the Pi. So there’s that. But the time I wasted and the frustration I endured were absolutely not worth learning that bit.

I’ve got Raspbian wheezy, a Debian derivative for ARM processors, installed on it now. That is running great so far. Probably just going to keep it there too. I’m still running Mint, also a Debian derivative. It’s easier to keep everything the same for the sake of learning.

Still here. Just limping along without an internet connection at the moment.