MythTV 0.29 Video Scripts Update

Whew! There have been many changes since my last post. One of which was a new house, which means a new computer area, which means a new, beefier computer!

I installed the latest (at the time) stable version of MythTV and used my scripts from before. It looked like it encoded the TV shows properly (an MP4 file was spat out), but ultimately the result wasn't put into the database, and therefore didn't help me at all. It finally dawned on me that my old setup was running 0.28. From what I can tell, there was a change between the two versions of MythTV involving a field (namely, the recordedid field from the recordedfile table, https://www.mythtv.org/wiki/Recordedfile_table) that my script was simply copying from the old recording and reusing when inserting the new recording into the database. This field is marked “unique,” and so the insert would fail. Hodge-podging something together, inserting “my @recKeys = grep(!/^recordedid$/, keys %$ref);” before building and executing the SQL insertion command – leaving that field out of the insert – worked. It looks like the DB is smart enough to assign it the next unique value and keep on keepin’ on.

All that said, I’ve not upgraded to v 0.30 of MythTV, so I’m not sure if the DB has changed any. If anyone has tried the scripts on the newest version, I’d be curious to know if it worked or needed any more tweaks.

Updated scripts at: https://github.com/gomezdevelopment/video-scripts/tree/master/MythTV


Windows 10 Upgrade

For a few months, I had been trying to get Windows 10 to upgrade on my laptop. I was able to do a clean install via a USB drive, but the upgrade wasn't working – it would get through step 1 (copying files), but die in step 2 (driver setup). The error code that kept coming up was C1900101-30018. The best description of this error comes from MyCE: “The C1900101-30018 error appears to be caused by software that deeply integrates with the system, such as antivirus products, hardware drivers and software that claim to increase system performance.” Their recommendation was to remove AVG's TuneUp Utilities (which I didn't have). The ONLY things I had installed were the base Windows 7 OS (straight from Microsoft, so no bloatware – which is especially an issue with my laptop being an Alienware M14X R2: Dell bloat + Alienware bloat. Whatever happened to a plain, simple OS that you customize and configure and tweak yourself? Oh yeah, that's Linux. 🙂 ), the network driver (for wifi, bluetooth, and ethernet – none are recognized without it), and all of the hotfixes that Win7 had.

It occurred to me this morning that my network card requires a special driver – specifically, the Qualcomm “Killer” network driver and its accompanying software, which is exactly the kind of deeply-integrated software that error description is talking about.

I un-installed the drivers, leaving me with no networking of any kind (but I had already made a Windows 10 USB drive beforehand), and gave the upgrade another shot. Windows 10 is now installed on my laptop and is validated through Microsoft. What's even better is that the wireless card is supported right out of the box. Whoo hoo! No more bloat driver!

People may be asking why I went with Win 10 at all – namely, for games. I have an mSATA drive and a 2.5″ HDD in my laptop, with the mSATA (which is the larger capacity of the two) running Windows, while the HDD has Linux on it. I typically boot into Linux, but for the times I need to play games, I have a full OS capable of gaming just a few seconds away. I upgraded to 10 because I needed to do a fresh install anyway – my Win 7 would blue screen on me even after a clean install. I am wondering if that was the result of the “Killer” wireless drivers really killing Windows in the process. And lastly, it was free. I will be locking down Windows 10 when I get some time tonight to fix the privacy issues.

Long story short: successful Windows 10 upgrade with Alienware M14X R2 laptop, error code C1900101-30018 being the result of Qualcomm Killer Network driver and software.


Migrating My Linux Install To An SSD

Last year, I had the pleasure of being in my former roommate's wedding. As a “thank you,” he gave me the nerd's credit card – a gift certificate to NewEgg.com. I sat on it for nearly 6 months before finally giving in to temptation and buying a Solid State Drive. Since I have programs compiled, configuration files dialed in, and non-standard programs installed, I thought this would be a great reason to do a migration rather than a clean install.

I started by searching for my bootable thumbdrive with Ubuntu 14.04 32-bit LTS. Couldn't find it. Bonus tip: to create a bootable thumbdrive, just use:

sudo dd if=/path/to/ubuntu/image.iso of=/dev/sdX bs=512k

where X is the thumbdrive's device letter in /dev/. Totally works!
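
If you're not sure which device letter the thumbdrive was assigned, listing the block devices first will tell you (either of these works on a stock Ubuntu live session):

lsblk
sudo fdisk -l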

Anyways, got a 32-bit version going. Created a new partition table, created an ext4 partition that filled up the drive, and mounted it at ~/new and the old drive at ~/old (real original, huh?). Did a mass rsync with:

rsync -Pa ~/old/ ~/new

Once that was done, I needed to install the GRUB2 bootloader. There isn't a direct way to do that from the live session, but you can use the chroot method:

sudo mount --bind /dev ~/new/dev
sudo mount --bind /proc ~/new/proc
sudo mount --bind /sys ~/new/sys
sudo chroot ~/new

This was working great, but then I ran into a problem: I was attempting to use a 32-bit live environment to write a bootloader for a 64-bit version of Ubuntu. Takeaway: if you use a 64-bit OS, use a 64-bit live CD or USB!

Once this is done, run:

sudo update-grub2
sudo grub-install /dev/sdX

Again, where X is the letter of the new SSD.

Now we have a bootable SSD with all of your data on it. BUT there are still a few more things that need to be done in order to finish up. Still in the terminal, type:

sudo blkid /dev/sdX

and make note of the UUID of your new SSD.

sudo vim /etc/fstab

(replace vim with gedit, nano, or your other preferred editor). You should see all of your mount points, including a line for your old drive's root (“/”) partition identified by a long UUID. Replace the old UUID with the new one (or comment the old line out and copy/paste in a new line with your new SSD's UUID).
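
For example, with completely made-up UUIDs, the root entry might end up looking like this (keep whatever options your line already had):

# old HDD root (commented out)
#UUID=0fa1b2c3-d4e5-6789-abcd-ef0123456789  /  ext4  errors=remount-ro  0  1
# new SSD root
UUID=9e8d7c6b-5a49-3827-1605-fedcba987654   /  ext4  errors=remount-ro  0  1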

Now, one final task. If you've read anything about SSDs, there is talk about TRIMing the drives. TRIM is the OS's way of telling the SSD which pages (the units that make up erase blocks, which are large compared to sectors in a hard drive) no longer hold valid data and can be written to again. With a traditional HDD, when a file is “deleted,” the sectors that correspond to the file are simply marked as safe to overwrite. Until that time, the files CAN be recovered – a sometimes necessary act that will make you happy you knew how to do it. An SSD will likewise mark the pages as available to be rewritten, but without TRIM, reusing them later means finding the block they belong to, copying the pages in that block that still hold data to an internal buffer, erasing the block, and rewriting it with the new information in place – which takes a significant amount of time.

TRIM can be done two ways in Linux – automatically upon file deletion (specified by adding the “discard” option to the root entry in the fstab) or with a cron job. The discard option seems like the better choice off the bat, as the TRIM is performed as soon as a file is deleted. This, however, can lead to performance decreases if you delete a large number of smaller files. The alternative is a single command that will run TRIM across the whole file system in one go:

sudo fstrim -v /

Where “/” is the mount point you want to trim. Since the SSD is our root mount, that is just /. To have this run automatically every day, create a daily cron job:

sudo vim /etc/cron.daily/fstrim

and use whatever you’d like to execute that command (outputting to a log file, not being verbose, different mount points, comments, etc). After this is done, mark it as executable:

sudo chmod 755 /etc/cron.daily/fstrim

I personally prefer the numeric (octal) method of entry for chmod, as it makes perfect sense to me: Read/Write/Execute bits, in order, for the User, Group, and Everyone.
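
For reference, a minimal version of that cron script might look something like this (the log file path is just an example I picked, not something that already exists):

#!/bin/sh
# Runs daily from /etc/cron.daily as root: TRIM the root filesystem and log what was freed
LOG=/var/log/fstrim.log
echo "$(date -R)" >> $LOG
fstrim -v / >> $LOG 2>&1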

Now, we are done. Log out of the chroot environment, unmount your ~/new/dev, ~/new/sys, ~/new/proc, and ~/new directories, and reboot. Tada!
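
Spelled out, that cleanup looks something like this (using the same mount points from earlier):

exit                   # leave the chroot
sudo umount ~/new/dev ~/new/proc ~/new/sys
sudo umount ~/new
sudo reboot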

My desktop is a Core2Duo E6400 w/ 4GB of RAM. My Linux OS WAS on a slower, 120GB IDE HDD, which also contains my Windows 7 install. Because I use my Linux install more often, I chose to migrate it over instead of the Windows install. I am able to boot into either Windows or the old Linux install off of the IDE drive, or the new Linux install on the SSD. My boot time went from 1:20+ down to 0:19 after GRUB. The vast majority of this tutorial came from this site, but I did not change the SSD's UUID – I instead modified the fstab file to accommodate the SSD's UUID so I could still boot back to my HDD-based Windows 7 install.


Truecrypt Corrupted Container File (And How I Breathed Easy Again)

Truecrypt is a very cool way to store files in an AES (or other cipher) cryptographically secure container that you can use to encrypt your documents or other files. At home, I have a large, 2 GB Truecrypt container that I use to file all of my scanned bills in, in an attempt at a paperless filing system. One day, I was adding some bills. I wanted to make sure I had the most recent copy from my backup server, so I used rsync to pull it down before I started. 1:1. Awesome. I added my files on the local box, and used the same rsync command to push it back up… NO! WAIT!! Ctrl+C!! I was pulling from the backup server again, so all of my additions would be overwritten. OK, now let me change the source and destination and… OK, off it goes. Wait a minute… this is even worse… rsync complete….

I had just messed up the entire container by smashing two different versions of the file together into a single one… Not. Good. And since we're dealing with crypto, one bit of difference results in a HUGE change.

Try mounting it:
“device-mapper: resume ioctl failed: Invalid argument. Command Failed”

… uh…. dang.

OK… what can I do now? Maybe I can recover the original? I tried using the awesome testdisk/photorec tools to recover the .tc file, but there's a problem. Truecrypt stores the encrypted key, information about the encryption algorithms, and partition information in the (encrypted) header of each .tc file, so there really isn't a set container header signature to carve for like there is with JPEGs.

After some reading, I resolved to recreate it as best I could using my offsite backup, which had been temporarily down and didn't contain the most recent version.

A side note: I use rsync over SSH to make backups of my files to my offsite. By default, rsync uses the size and date/time information on a file to determine what to send, as that check is extremely fast. Using rsync to update and transfer Truecrypt containers poses a problem, however, as the date-modified timestamp doesn't always update when the container is mounted and changed in Linux. This can be resolved either by using “touch -m container.tc” to update the time-modified timestamp on the container, thus making rsync take a look at it, or by using rsync's “-c” option to compare by checksums rather than timestamps. The latter is significantly slower, as both ends need to do an MD5 checksum of the file and compare, but it is better than arbitrarily updating the date modified, IMHO. And now, back to our original programming.
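
In practice, the two approaches look something like this (the hostname and paths here are just examples):

# Option 1: bump the container's timestamp so a normal rsync notices it
touch -m bills.tc
rsync -avP bills.tc user@offsite:/backups/

# Option 2: skip the timestamp check and compare checksums instead (slower)
rsync -avPc bills.tc user@offsite:/backups/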

The problem with using an old copy of my bills container was that it was missing all of 2013, plus some other bills I had filed and sorted since 2010… I made a copy of the now-screwed-up container and labeled it as the corrupted version, figuring I MIGHT be able to do something with it later on down the road and recover SOME of the files. I looked around some more and found that I could use a hex editor to dig through the free space of the drive and might be able to recover the original container… OK, not exactly ideal, and I might not be able to recover it at all. I resigned myself to having just lost ~1 GB of bills that I may need to reference forever. I cannot scan them in again, as the bills have been turned into confetti by the almighty paper shredder, and the thumbdrive I used to scan them to has since been overwritten.

As a last resort, I saw this post suggesting I use a mount option (-m=nokernelcrypto). I gave it a shot. And it worked. HALLELUJAH! I copied all of the files into a newly created bills.tc and backed up like crazy to my offsite.
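
From the command line, that rescue mount looks something along these lines (treat the exact invocation as a sketch; the container and mount point names are mine):

sudo mkdir -p /media/recovery
truecrypt -m=nokernelcrypto corrupted-bills.tc /media/recovery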

Lessons learned:

  • If you are getting a “device-mapper: resume ioctl failed: Invalid argument. Command Failed” error, try the -m=nokernelcrypto mount option.
  • Double check your rsync source/destinations.
  • Because rsync goes by date/timestamps by default, and the Truecrypt container's timestamp may not be updated, either do a “touch -m container.tc” before syncing or use rsync's “-c” option to go by checksum instead of date/time.
  • I’d take 1 or 2 corrupted PDFs rather than lose ~1GB of data.

MythTV x264 MP4 Compression Scripts

I recently set up MythTV at my house and use a PS3 to stream over WiFi to my TV and sound system. Because stations broadcast digital signals now, what is broadcast and what is received are 100% identical (provided there is no signal loss). The codec that my TV card captures is straight MPEG-2 at a very high bitrate. “How high is it?” Well, I'm glad you asked! Depending on the station and program, a 1-hour show (with commercials) can run to just over 8 GB. Run some numbers, and that is a combined bitrate of over 18.6 Mbps, or over 2.3 MB/s. 2.3 megabytes a second. For reference, the absolute max that a video DVD can provide is 10.08 Mbps for both audio and video (http://en.wikipedia.org/wiki/DVD-Video#Data_rate). It looks fantastic. The problem I run into, though, is wireless bandwidth. The PS3 only has 802.11g wireless (54 Mbps theoretical max). Take into account that that maximum covers sending and receiving combined, plus signal degradation depending on countless factors and other usage on the wireless network, and you are lucky to push 1 MB/s. Video compression to the rescue!
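
To sanity-check those numbers (assuming roughly 8.4 GB for the one-hour recording and decimal units):

8.4 GB / 3,600 seconds ≈ 2.3 MB/s
2.3 MB/s × 8 bits per byte ≈ 18.6 Mbps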



Custom Spinner in Android To Change Background and Text Colors

When writing a recent app, I wanted to use a spinner. No problem! It worked great in the emulator – grey background with black text. I checked it out on a few physical devices and I was getting white text on a grey background. Absolutely useless. I found ways of changing the text inside of the selected box, but was still getting white text on a grey background when presented with the options to select from. Everything I had found essentially said to me “write your own!” but gave no idea of HOW to do that. Well, I'm going to share how I wrote my own.

I already liked the layout that the simple_dropdown_item_1line option provided. My code looked like so:

ArrayAdapter<String> adapter = new ArrayAdapter<String>(getApplicationContext(), android.R.layout.simple_dropdown_item_1line, list);

In Eclipse, hold Control and click on the simple_dropdown_item_1line item. This brings up the layout file that is actually being used. Aha! Now I just need to copy this, make my own, and reference it! I copied the body of the simple_dropdown_item_1line.xml file into my res/layout/ directory (as dropdown_item_1line.xml) and modified it to add in the following:

android:textColor="#000000"
android:background="#ddd"

The #ddd is the grey color that the spinner uses normally, and the #000000 specifies black text. With that modification in place, it was time to change my reference from the built-in Android layout to my custom one. I changed the array adapter to the following:

ArrayAdapter<String> adapter = new ArrayAdapter<String>(getApplicationContext(), R.layout.dropdown_item_1line, list);

In case you didn't see the difference, it is not using android.R.layout… but the local R.layout… instead. The color changes in the layout change the selected item's text and background as well as the dialog text and background.
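
For reference, the finished res/layout/dropdown_item_1line.xml is just a single TextView and ends up looking roughly like this (the other attributes come from whatever version of simple_dropdown_item_1line.xml your SDK ships, so treat this as a sketch rather than an exact copy):

<TextView xmlns:android="http://schemas.android.com/apk/res/android"
    android:id="@android:id/text1"
    android:layout_width="match_parent"
    android:layout_height="wrap_content"
    android:singleLine="true"
    android:ellipsize="marquee"
    android:textColor="#000000"
    android:background="#ddd" />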

With that 4 minute change, I had fixed the thing that was bothering me! Hope this helps someone else out there!

Edit: Whoops! Never realized that I had the wrong colors mentioned when using the emulator. Fixed!


First Post!

Hello everyone and welcome! This site will be the page for all things related to my development notes and Android apps available on the Google Play store. A new app will be available soon, so check back soon!
