Help Secure Everyone’s Email by Encrypting
Previously I wrote about the protection I am adding to my mail by using PGP or GPG; you can find the article by clicking here. My involvement with the EFF and AVNation has also included comments about privacy: AVNation Privacy & EFF Mail Links.
Something I realized while thinking about this subject is that if one sends very few encrypted e-mails, the ones that are encrypted will stand out in the mail being sent. Now you might wonder what I am doing that requires encryption. The previous blog post explains why I am encrypting my mail.
I have an additional reason now: confusing the government and anyone else monitoring traffic. This idea is discussed in Cory Doctorow's book Little Brother (http://craphound.com/littlebrother). The section below is used under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 license. The quote came from line 1826 of the HTML version available on Mr. Doctorow's website.
“So how come you weren’t on Xnet last night?”
I was grateful for the distraction. I explained it all to him, the Bayesian stuff and my fear that we couldn’t go on using Xnet the way we had been without getting nabbed. He listened thoughtfully.
“I see what you’re saying. The problem is that if there’s too much crypto in someone’s Internet connection, they’ll stand out as unusual. But if you don’t encrypt, you’ll make it easy for the bad guys to wiretap you.”
“Yeah,” I said. “I’ve been trying to figure it out all day. Maybe we could slow the connection down, spread it out over more peoples’ accounts –”
“Won’t work,” he said. “To get it slow enough to vanish into the noise, you’d have to basically shut down the network, which isn’t an option.”
“You’re right,” I said. “But what else can we do?”
“What if we changed the definition of normal?”
And that was why Jolu got hired to work at Pigspleen when he was 12. Give him a problem with two bad solutions and he’d figure out a third totally different solution based on throwing away all your assumptions. I nodded vigorously. “Go on, tell me.”
“What if the average San Francisco Internet user had a lot more crypto in his average day on the Internet? If we could change the split so it’s more like fifty-fifty cleartext to ciphertext, then the users that supply the Xnet would just look like normal.”
“But how do we do that? People just don’t care enough about their privacy to surf the net through an encrypted link. They don’t see why it matters if eavesdroppers know what they’re googling for.”
“Yeah, but web-pages are small amounts of traffic. If we got people to routinely download a few giant encrypted files every day, that would create as much ciphertext as thousands of web-pages.”
This is a relatively small action and rather simple to do, but the fact that it changes the traffic picture could be helpful for others: it keeps PGP/GPG encrypted traffic from being such an outlier that it gets noticed. As the EFF posted on Data Privacy Day, privacy is a team sport. There are directions for how to do this at https://ssd.eff.org/ (hover over the tutorials section). If you want to test that it worked, my public key identifier is C93A52C6, and you can download my public key directly from my site.
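If you want to see what sending encrypted mail looks like under the hood, here is a minimal sketch in Python that shells out to the gpg command-line tool. It assumes gpg is installed and that my public key has already been imported into your keyring; the file name is illustrative.

```python
import subprocess

# Encrypt a file to a recipient's public key using the gpg command line.
# The key ID is my public key identifier from this post; message.txt is
# an illustrative file name.
recipient = "C93A52C6"

subprocess.run(
    ["gpg", "--armor", "--encrypt", "--recipient", recipient, "message.txt"],
    check=True,  # raise an error if gpg fails (e.g., key not imported)
)
# gpg writes the ASCII-armored ciphertext to message.txt.asc, ready to be
# pasted into an e-mail body or attached to a message.
```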
I will freely admit that I am not sure it will make a difference, but it cannot hurt.
January 31, 2017
Metadata is your friend
Previously I wrote about how one can store too much data. I was guilty of that personally: I have far too much data, and it is hard to sort through easily. This collection is not just images I have taken; it is also documents, spreadsheets, and presentations. What is often overlooked is that there are tools out there to address the issue head on, but most of us do not use them. That is the power of metadata.
For those of you not familiar with the term, metadata is data about data. Yes, that sentence is circular on purpose. Metadata is a way to describe data using additional data. One example is the "Tag Cloud" to the right of this blog: I manually add descriptive tags for each post so that people can find them easily. Another is keywording or captions in pictures. The actual data is the image itself; the metadata describes the data contained within the image.
The key is to actually fill out and use the metadata options in your software. It can make finding something much later much easier. Metadata is not limited to photographs and blog posts; the much-maligned Microsoft Office products include the ability to add metadata to a file. Microsoft does not call metadata "metadata"; they call it "Properties". This data can be very helpful.
Let's say that you were writing a letter to an airline about the difficulties you had booking a flight with frequent flier miles. When you save the file, you might give it a filename such as "United July 2011". Later, when you go looking for the file, will you be able to find it just based on the filename? What happens with something less directly identifiable? It becomes a little harder. However, if I add a brief description that says "Correspondence about trouble booking a flight using frequent flier miles" and put in keywords of "United, Frequent Flier, Reward, Travel", the file becomes much easier to find. Both the Mac and Windows operating systems provide utilities to find files using metadata; on the Mac it is the search tool built into the Finder.
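To make that concrete, here is a rough sketch of searching those Properties programmatically. It assumes the third-party python-docx package; the folder and search term are illustrative.

```python
import os
from docx import Document  # third-party python-docx package (assumed installed)

SEARCH_TERM = "frequent flier"  # illustrative search term

# Walk a documents folder and report any Word file whose Properties
# (Microsoft's name for file metadata) mention the search term.
for root, _dirs, files in os.walk(os.path.expanduser("~/Documents")):
    for name in files:
        if not name.endswith(".docx"):
            continue
        path = os.path.join(root, name)
        props = Document(path).core_properties
        text = " ".join(filter(None, [props.title, props.subject,
                                      props.keywords, props.comments]))
        if SEARCH_TERM in text.lower():
            print(path)
```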
You might be thinking to yourself, “I do not need to do all this extra work, I can keep track of my files.” I would like to leave you with perhaps the most compelling reason to fill out your metadata – media files.
All the MP3 and other media files that are organized in iTunes are organized using metadata. Can you imagine how difficult it would be to go through 4,730 files to find one specific piece of media? How about if you have multiple versions of the same song? Without metadata, media management would be very difficult.
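If you are curious what that metadata looks like, a few lines of Python can read it out of an MP3. This sketch assumes the third-party mutagen package and an illustrative file name.

```python
from mutagen.easyid3 import EasyID3  # third-party mutagen package (assumed installed)

# Read the ID3 tags -- the metadata -- that iTunes and similar libraries
# use to organize thousands of media files.
tags = EasyID3("some_song.mp3")  # illustrative file name
for field in ("title", "artist", "album"):
    print(field, "=", tags.get(field, ["<missing>"])[0])
```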
Now if you will excuse me, I have to go fill in some document properties.
Making the interface work for me
Often the controls for a piece of software are not in the friendliest locations for one-handed operation. By one-handed operation I mean one hand on the keyboard, one hand on the mouse. When working in graphics programs I find myself working that way quite often. It could be as basic as a drawing program where I need to press the Z key to initiate the zoom function and then use the mouse to decide where to zoom. Other times it is more complex, such as selecting an image, zooming into a one-to-one pixel rendering, panning, and then marking the image as a keeper or a chucker. It could just as likely be a drawing program where I am documenting an idea. For my #AVTweeps, just think AutoCAD.
Recently I found myself sore at the end of an image review session from all the unnatural movements. My data management workflow is outlined in a previous blog post, but looking at the actual process, I noticed a lot of hand movement. My review process is based around Adobe® Photoshop® Lightroom® (quite the mouthful, so Lightroom for short). The program itself is very powerful and does help me manage my images, but it lacks some ergonomics for the one-handed user.
The way I cull images is to go into the Library module and review the images at a resolution that fits the screen. I quickly look at each one and decide if it is a Pick, Unmarked, or a Reject. These selections are made using the P, U, and X keys. Notice how they are laid out on the keyboard.
Not very easy to navigate with one hand. Now let's say I want to zoom into an area: one can either use the mouse to enter a 1:1 view or press Shift and Spacebar to enter the same mode, then use the mouse to zoom to areas. I do this to see how much aberration is visible and whether the shot is in focus; once again I decide if it is a pick, unmarked, or rejected. Lightroom has a setting to advance to the next image after assigning a value to the image.
That setting seems like it would save time, and it often does. However, if I want to assign two things to an image, I have to back up to it. If I find an image of the same subject later in the batch that is better than a pick I already decided on, I go back to unmark the previously picked image. So now I have a few options. I can expose the filmstrip at the bottom of the application window, click on the image with the mouse, and then press U. If it was just the previous image, I can use the arrow keys. Notice that both of these options require me to take my right hand off the mouse and place it on the right half of the keyboard. I could also use my left hand on the right side of the keyboard, but that still means changing positions.
Let's say I want to see if a crop makes an image better. An example of a crop changing an image happened at the baseball game I photographed: since I was sitting in the stands, some of the images have the backs of people's heads in them. Cropping the heads out made the pictures better, but some were still chuckers, not keepers. In Lightroom I enter crop mode by pressing R, which switches to the Develop module, where I use the mouse to make the crop. When I finish, I want to mark the image as a keeper or chucker. I cannot do that in the Develop module; I have to be in the Library module. To return to Library mode I either take my right hand off the mouse to do the keyboard contortions or move the mouse away from the work area. Neither solution is very ergonomic.
There are keyboards available that are designed to fix some of these issues by changing the layout and labeling the keys. However, some are more expensive than the program itself, and they are dedicated to the program, so I would still need my regular keyboard for things such as entering text. Not really what I was looking for.
I started thinking about it more and came up with a more practical solution, in my not-so-humble opinion. I purchased a customizable gamer keypad, a Logitech G13 Programmable Gameboard with LCD Display, as it is Mac compatible (yes, it is also Windows compatible). (If you decide to buy one after reading my blog, using this link will give me a little commission.) This lets me decide how the keystrokes are used; I can lay them out to my satisfaction.
I then determined what keys I used most. They are spread across both the left and right hands, and some of them require multiple hands, such as entering Library Mode (Command + Option + 1).
These main keys were then assigned to the keypad as I found would work best for me. (Drop me a line if you would like a copy of the configuration file.)
I had 200-plus images from a business trip and figured that would be a great way to test it out. I went through the images and did the rating, cropping, and keywording in about an hour, including uploading to a SmugMug gallery. There was an unexpected benefit as well: I was able to hide all of the tool palettes in Lightroom so the images were bigger on screen during the review (remember, bigger is better). I do not have exact times for similar tasks using the "standard" keyboard commands, but the important thing is that I was not sore and it was not as tiring.
The keypad did the thing I think all tools should do: get out of the way and let me work. Other than when I had to type in keywords, I used just the keypad and the mouse. I did not have to move my hands around the keyboard and mouse.
I also learned a couple more tricks in the process. I can use the keypad in more than one program but keep the key functions the same. By key function I mean that the same key that sends an R to enter Crop mode in Lightroom can be configured to send a K in Photoshop or Command + K in Preview to perform the crop function. The same key press for me sends different keystrokes to each application. Much easier than having to remember all the different commands, similar to Cut, Copy, and Paste being the same in almost every program. That is a fine example of what I was trying to accomplish: cut (Command + X), copy (Command + C), and paste (Command + V) are not great mnemonic devices at first blush, but the arrangement of the keys makes them very easy to use.
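To make the one-key, many-commands idea concrete, here is a sketch of the mapping. The Lightroom, Photoshop, and Preview keystrokes come from the paragraph above; the keypad key name and the lookup code are illustrative, since the real mapping lives in the Logitech profile software.

```python
CROP_KEY = "G1"  # hypothetical name for one physical key on the keypad

# One physical key sends a different keystroke depending on the application.
keymap = {
    "Lightroom": {CROP_KEY: "R"},
    "Photoshop": {CROP_KEY: "K"},
    "Preview":   {CROP_KEY: "Cmd+K"},
}

def keystroke_for(application: str, key: str) -> str:
    """Return the keystroke the keypad should send to the active application."""
    return keymap[application][key]

print(keystroke_for("Lightroom", CROP_KEY))  # -> R
print(keystroke_for("Preview", CROP_KEY))    # -> Cmd+K
```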
The medium is as important as the message
Last week I participated in a Twitter chat, an #AVChat hosted by @AVWriter (Linda Seid Frembes), and I was the guest "talker". It was an interesting experience. I am by no means a digital media expert, but I think the idea of a Twitter Chat (or Tweet Chat, or TwitChat) is very interesting. Allowing people to connect and share ideas and experience in the virtual world is a good thing. Very similar to guild meetings in earlier times, it lets people in the same trade share knowledge. It also lets people participate how and where they are available, with the option of a replay or transcript. It worked well; however, it turned into more question-and-response than conversation.
I in no way blame the #AVTweeps or Linda. I think it is the medium, Twitter. The idea that one could connect and participate as one preferred seemed interesting, but I am not sure it is appropriate for a moderated talk. Even though we were trying to make the gathering as portable as possible, there were still some vestiges of old-school technologies. I was online with five devices with various Twitter clients on them, and the best communication tool of the bunch was what I had attached to my head: I was talking on the phone with Linda. Part of this approach was that it was a first for both of us to moderate a chat using Twitter.
The plan was that Linda would tell me the next question so I could prepare my answer; she would send the question out, I would answer it, and then discussion would take place. What actually happened was that she would tweet the question and multiple people would respond. It was similar to someone asking a presenter for an image of a tree and members of the audience also presenting their own images of trees. It was interesting to see everyone's responses, but I am not sure it is possible to comprehend all that information and respond at the same time. I found myself reading on one screen and responding on another. The way I avoided missing questions was that Linda would alert me by phone that something had gone by that I should answer.
The medium of Twitter was such that there was too much information flowing. I am not saying that is a bad thing. As I look back and read the transcript, I can comprehend more than I did in "real time". I liken the TwitChat to trying to hold a class in a trade show booth. People are paying attention, there are lots of conversations going on, ancillary distractions occur, and ultimately some information is missed. The information is not ignored, and people are not being disrespectful or malevolent; there is simply so much going on that things are missed.
The same issue occurs in photography: viewing a timed image presentation is not the same as looking at a static image at your own pace. The root cause is that the interval might not match the viewer's speed. The timing could be too slow, so the observer gets bored; too fast, so the image is not truly viewed; or the sheer number of images overwhelms the audience. Obviously each person is going to have their own opinion of the proper timing.
Given the option of watching a presentation that self-advances versus one I control, I will pick the viewer-controlled one. If I have the choice of going to a gallery and strolling through artwork at my own speed or watching a presentation of the images where I cannot choose the timing, I will pick the gallery.
Of course there are applications for all of these mediums. A digital picture frame of snapshots that advances once a minute, with a large print of a photograph hanging behind the desk to allow for longer viewing, could be the perfect office solution. (The differences between an image and a photograph.) Having timed previews on the front page of a website while still allowing visitors to browse content on following pages is the same idea. A TwitChat could be just as effective for some as a formal online seminar with a moderator and a PowerPoint or Keynote presentation.
It is simply a matter of selecting the right medium for the message. There is no one right answer; you have to pick what works for you as a viewer. As a presenter, think about how you can let your audience determine their preferred method without ruining your message. It is not an easy process, and it is sometimes overlooked, but it is important to consider.
If you are wondering why I did not say "slideshow" for presentation, it is that to me a slide is an image on a transparent medium that is placed in a projector or viewer for display. It is its own medium, just as analog is different from digital in audio. I wanted to make sure the idea was clear.
The airplane challenge for help, software, and interfaces
Another blog post written at 32,000 feet, as that is when the issue hit me. I have various electronic devices, as my dedicated reader knows, and I have previously talked about various data access connection challenges. This new challenge is not one of my own doing; it is a case of poor user experience or use case definition. The problem was illustrated by Amazon and their Kindle applications, but it does not apply only to them. This challenge affects many applications beyond this example.
I have found a case where the advantages of electronic delivery of a book outstrip the disadvantages I previously outlined. It happened with a "for Dummies" book. At work, I am on a software implementation team rolling out a new application package, and I wanted the "for Dummies" book for the application. I looked at Amazon and the book was available both in paperback and in Kindle form. The Kindle version was substantially less expensive, but the key item was literally instant delivery. While on a conference call I was able to purchase the book, take delivery of it, and reference it during the call. It was very powerful and better than using Internet search tools, as it has a high signal-to-noise ratio and no rabbit trails.
The next day I had a business trip, and I brought my analog reading material and my electronic versions. On the flight I started to truly read my newly purchased book. It was also the first time I had explored some of the Kindle application features. I saw that there were sections of the book that were underlined; not underlined text, but text with a dashed underline. I was not sure what it was at first, but I found out it meant that other readers had highlighted that passage. The idea of crowd-sourced highlighting intrigued me; it helps to know which areas one should pay attention to.
I wanted to see what other features were available. My brain needed a little break from thinking about business practices, so I was going to use that time to browse through the help file and see what features I might not be using in the Kindle application. I was airborne when I wanted to do that, with no Internet access on that flight. As a result of not being connected to the Internet, the help file was not available.
That seems very counterintuitive: why would an electronic reading application not include a help file with it? Think about that for a moment. Something that is designed to read documents while disconnected from the Internet cannot read its own help file while not connected. It is not just Kindle that has this design flaw. Cloudreader, Nook, and iBooks for iPad do not have a help file that is readily available offline. I am sure I could continue to list others as well. It also occurs with applications for workstations.
Not all applications are that short-sighted. Two applications on my iPad, iAnnotate and DocsToGo, install their help files as documents you can read from within the application, so help is available offline.
Makes perfect sense to me. An application that is designed to be portable should have supporting documentation that is portable. So for those of you involved in the design and creation of applications, think about the user who is not connected to the Internet. They might want to refer to the supporting documents; you should make it easy for them. The fact that I turned to the help file already means the application is not intuitive enough. Do not compound the issue by making it difficult to find the help.
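The fix is simple enough to sketch. Nothing below is any vendor's actual API; it is just the shape of the idea: look for a help document shipped with the application first, and fall back to the web only when a connection is all you have.

```python
import os
import webbrowser

# Hypothetical paths and URL, for illustration only.
BUNDLED_HELP = os.path.join(os.path.dirname(os.path.abspath(__file__)),
                            "help", "index.html")
ONLINE_HELP = "https://example.com/reader/help"

def open_help() -> None:
    """Prefer the help file that ships with the app; use the web as a fallback."""
    if os.path.exists(BUNDLED_HELP):
        webbrowser.open("file://" + BUNDLED_HELP)
    else:
        webbrowser.open(ONLINE_HELP)  # only works with an Internet connection
```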
This concept also applies to those of you creating custom control interfaces using software created by others. On more occasions than I care to count I have ended up troubleshooting a control system and having to guess. These guesses range from which IP addresses connect to the system, to what the control system uses for a backend, to how to get help.
For application users, I recommend you try out your applications before you travel with them or are disconnected from the Internet, to make sure you understand how to use them. The help files might not always be available.
Well the fasten seatbelt sign just came on….
<note this post was recreated after a website crash, good thing I backed it up>
Data Backup and Access
I have found a few things out over the past few weeks that I figured I would share with you, my faithful reader. I had a logic board failure on my MacBook Pro, which meant I was sans laptop for approximately 10 days. Less than 12 hours after I got it back, the cable modem at my house failed.
So between not having my personal laptop and then Internet access being a car ride away, I discovered a few things along the way.
- Backing up data is important, but one also needs access to the data
There are a few other tangential things I found out as well, such as changes to my photography workflow, why online instructions should not be the only instructions, and how unfettered Internet access can be a key item, but those will be separate posts.
Using my backup solutions, none of my data was in jeopardy; using that data, however, was the challenge. I have been using JungleDisk as my incremental off-site backup solution. It works very well for me, but it came with some choices I was not fully aware of when I made them. Using a block copy approach I could reduce the amount of bandwidth and storage space I use, but this does not come without tradeoffs. By making this choice I would be unable to browse the files online; I would have to actually restore them using the client software. At the time I did not think that was a big deal, as I figured I could always install the client on another computer and get all the data back.
A key item here is that it is my off-site backup. Too many people think that just having a backup is sufficient. It is not, as there are other things to consider than just a hard drive or computer failure. One has to think of other ways that data can be destroyed: "Someone stole my car! There was an earthquake! A terrible flood! Locusts!!" Having the data off site makes it much less likely that data will be lost.
I could have installed the client on another computer and gotten all the data back, but that still was not going to solve all my issues. As a result of not being able to browse the contents, I am going to change my approach yet again.
Some items will be backed up using block copy, other items using file copy, and still others will be backed up to either MobileMe's iDisk or my Dropbox account. You might wonder what data goes where and how to keep it all organized; that is actually fairly easy as long as I make the right decisions when starting. Simply by putting files into different locations on my computer, they will be backed up in different ways. Items placed in the Documents directory go to JungleDisk, items in the Dropbox folder go to Dropbox (still waiting for selective sync before I am 100% happy with it), and items stored in iDisk go to MobileMe's iDisk.
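The routing is really just a lookup table keyed on the folder a file lives in. Here is an illustrative sketch of that decision; the folders and service labels follow the scheme above, but the code itself is mine, not any of these products'.

```python
import os

# The folder a file lives in decides which backup service covers it.
BACKUP_ROUTES = {
    os.path.expanduser("~/Documents"): "JungleDisk (block copy)",
    os.path.expanduser("~/Dropbox"):   "Dropbox",
    "/Volumes/iDisk":                  "MobileMe iDisk",
}

def backup_destination(path: str) -> str:
    """Report which service backs up a given file."""
    full = os.path.abspath(os.path.expanduser(path))
    for folder, service in BACKUP_ROUTES.items():
        if full.startswith(folder + os.sep):
            return service
    return "DVD/secondary hard drive only"

print(backup_destination("~/Documents/letter.docx"))  # -> JungleDisk (block copy)
```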
The key to this approach is to make sure that a live file is stored in one location and only one location. I have often encountered problems where two files have the same name but different time stamps, or live on different computers, so how do I know which one is current? Since all of these items are backed up to the "cloud" of the Internet, I do not have to worry greatly about data loss. I still do backups to DVD and secondary hard drives every so often so that I am not completely at risk. For items I want backed up in more than one location (I have not hit any yet), my plan is to use ChronoSync to keep a "Backup" directory in sync. This will let me create a directory in one of the other storage locations labeled KeyJDBU (Key JungleDisk Backup items) and then use ChronoSync to decide what to copy into it and keep in sync.
Having the key items in iDisk or Dropbox will also make them browsable without having to restore all the data. It still does not solve another key issue: do I have access to the programs needed to use the data once it is restored? I found that quite often the answer was no. Most of this situation was my own fault, as I chose what format to store the data in. Once again I could reinstall and have the data back, but that would take a while, especially with the licensing headaches some companies have put in place (that means you, Adobe). I am now considering how to handle that issue.
Sometimes numbers don't do something justice
So I use SmugMug to host my photos, as they have some really cool features and people. I also started following a few of them on Twitter, and there was a tweet that just made my head hurt, so I sat down to do the math on it. Okay, I also used Wolfram Alpha to help with it.
The Tweet from Baldy stated:
“Whoa! Vincent LaForet‘s new Canon Mark IV vid on SmugMug used over 20 terabytes of bandwidth in 300,000 views in 14 hours.”
So I started to figure out how many megabits per second that was, so I could compare it to the typical network connectivity I am more familiar with: 100BaseT (Fast Ethernet) and Gigabit Ethernet. Well, it just became amazing.
- First I converted 20 TB to megabits
- 20 TB = 20,000,000,000,000 bytes = 160,000,000,000,000 bits = 160,000,000 megabits
Yes, that is 160 million megabits
- The next thing was to convert hours to seconds
- 14 hours = 840 minutes = 50,400 seconds
- Now to convert to megabits/second
- 160,000,000 megabits / 50,400 seconds ≈ 3,174 megabits/second ≈ 3.2 gigabits/second
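For anyone who wants to check the arithmetic without Wolfram Alpha, here is the same conversion in a few lines of Python (using decimal units, 1 TB = 10^12 bytes):

```python
# Recompute the figures above using decimal units (1 TB = 10**12 bytes).
terabytes = 20
seconds = 14 * 60 * 60                  # 14 hours = 50,400 seconds

bits = terabytes * 10**12 * 8           # 160,000,000,000,000 bits
megabits = bits / 10**6                 # 160,000,000 megabits
mbps = megabits / seconds               # about 3,175 megabits/second

print(f"{mbps:,.1f} Mbit/s = {mbps / 1000:.1f} Gbit/s")  # 3,174.6 Mbit/s = 3.2 Gbit/s
```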
So that data is coming out pretty freaking fast.
Wolfram Alpha had cool comparisons to put it in context: 20 TB is approximately equal to the text content of the Library of Congress, and to about 1/8th of the estimated data content of the surface web (~170 TB).
Dang, no wonder they need 2 TB of flash memory for a server. You can see the picture, along with Don MacAskill, CEO of SmugMug, here: http://bit.ly/3HlXzH
Signal to Noise
A few days ago I posted a tweet that said, "signal to noise is important, not just in audio but in life". That post was an amalgam of someone's tweet commenting on the palaver at their job and the result of the number of tweets I was getting from one stream. I realize that the single stream is not an indictment of everyone who uses Twitter. Twitterers?
I figured I would post here what I learned from a quick study over the past week. I am following 40 streams; 32 posted something in the past week, for a total of 522 tweets, or an average of about 16 tweets per stream. However, one person posted 208 tweets in one week, the vast majority of which were repetitive and redundant. Since a picture is worth a thousand words, how much is a graph worth?
40% From one stream
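This is also the kind of collating that should take a few lines rather than a spreadsheet fight. Here is a sketch; only the 208 count and the 522 total come from my tally, and the stream names and filler counts are placeholders.

```python
from collections import Counter

# Tally tweets per stream for the week and compute the loudest stream's
# share. Only the 208 count and 522 total are real; the rest is filler
# chosen so the numbers add up.
tweets = Counter({"noisy_stream": 208})
for i in range(31):                                  # the other 31 active streams
    tweets[f"stream_{i:02d}"] = 11 if i < 4 else 10  # placeholder counts (sum 314)

total = sum(tweets.values())                         # 522
top, n = tweets.most_common(1)[0]                    # the noisiest stream
print(f"{top}: {n} of {total} tweets ({n / total:.0%})")  # -> 40%
```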
In addition, the person added an identifier so that they would trend, and much of their information comes from AlertDeck. So I am no longer following that person. The disappointing part is that they actually have something valuable to say; they have just started adding too much noise in trying to market themselves.
So my warning is that marketing via Twitter can be done, but if there is no content, everything gets turned off. Stay tuned… I might decide to reveal who the offender is.
Oh yeah, I have also decided that Apple's iWork Numbers '08 is not very powerful when it comes to collating data, as I still had to do much of it manually instead of just building a pivot table in Excel. I also still can't activate half my applications…
Interface makes quite the difference
So I have gotten a PlayStation 3. I also have a PS2, PS1, Nintendo Wii, Nintendo Game Boy, Windows, and Mac; suffice it to say I have various gaming environments. Some games are available on multiple platforms, and I have had a chance to try a few of them. On the PS2 I enjoy playing the SSX series of games, an EA Sports title that is fairly fun. So when I saw SSX Blur available at a reasonable price for the Nintendo Wii, I figured I would try it.
I was surprised at how different it was compared to the other versions. I realize that part of it is the change in the controller interface. I think the programmers were trying to use the advanced control options by having the accelerometers drive the trick interface. For instance, rather than using the controller as an analog for the board alignment, one has to shake the controller in a pattern to pull a trick. However, it is not intuitive. To do a trick, one draws a heart with both controllers, one in each hand. A completely different experience than pressing Square while using the D-Pad.
There are times that the accelerometers do work wonders, most often when used as an analog for another control. For example, I downloaded a demo of a golf game for the PlayStation 3. It was abysmal compared to Wii Sports golf, where one uses the Wii controller as a golf club. So it is not the controller that is flawed, but rather the application of the interface and technology.
I also downloaded a demonstration version of Civilization for the PS3. I really enjoy playing it on computers (both Mac and PC), so I figured it could be cool on the PS3. However, I was surprised at how different the experience was between the computer screen and mouse and the game controller and television resolution. It was just not as familiar or intuitive to me. Perhaps that is because I am used to something else.
I think it just goes to prove that the interface has to be adjusted to the environment it is used within. For a video game where one has to survey a large area, such as Civilization, having more control over the view is key. For a sports game, the controls should be analogous to the way the object on screen moves. That has not always been the case (why does pressing the "X" button make a character jump?), but that dissociation is easier to compensate for than drawing a heart in space to make the object flip upside down.
After all that, all I have to say is: don't assume that the experience will always be the same when the human interface changes.