diff --git a/export/2005-05-26-say-what-you-mean.md b/export/2005-05-26-say-what-you-mean.md new file mode 100644 index 0000000..694b0f1 --- /dev/null +++ b/export/2005-05-26-say-what-you-mean.md @@ -0,0 +1,13 @@ +--- +title: "Say what you mean" +date: 2005-05-26 01:47:30 +comments: false +tags: + - "usability" +description: "While doing some DNS modifications (for this site, in fact) on AIT Domains , I received the following error message when trying to update the nameservers:" +permalink: /archives/say-what-you-mean/ +--- + +

While doing some DNS modifications (for this site, in fact) on AIT Domains, I received the following error message when trying to update the nameservers:

+

Failed to modify domain nameservers! Error: NameServer not added to easy-reader.net Invalid old value for an attribute.

+

The site’s programmers may know what that means, but I certainly don’t. That’s not the kind of error message that should be public-facing. These guys really need to read Defensive Design for the Web.

diff --git a/export/2005-05-27-trash-dom-treasure.md b/export/2005-05-27-trash-dom-treasure.md new file mode 100644 index 0000000..0ab4b1b --- /dev/null +++ b/export/2005-05-27-trash-dom-treasure.md @@ -0,0 +1,21 @@ +--- +title: "Trash + DOM = Treasure?" +date: 2005-05-27 02:42:47 +comments: true +tags: + - "(x)HTML" + - "CSS" + - "design" + - "JavaScript" +description: "I was browsing the popular links on del.icio.us today and stumbled onto Nifty Corners and (via that page) More Nifty Corners . I have to say that I am incredibly impressed with the scripting, but I fear there is something wrong with..." +permalink: /archives/trash-dom-treasure/ +--- + +

I was browsing the popular links on del.icio.us today and stumbled onto Nifty Corners and (via that page) More Nifty Corners. I have to say that I am incredibly impressed with the scripting, but I fear there is something wrong with this picture.

+

Lately, there have been some border wars over the CSS :hover pseudo-class and its forays into the behavior layer. Sure, it’s easier to have CSS do the work sometimes, but that doesn’t make it right. Frankly, I agree with the concept that behavior should be separated from presentation, just as presentation should be separated from content (which is why I use JavaScript to open and close the faux-<select> in my <select> Something New series).

+

I am also a big believer in clean, semantic markup, so I become concerned when anyone is adding superfluous code to the document to force a design issue. I know some might say I live in a glass house, but when I see someone putting code like this

+

<div id="container">
  <b class="rtop">
    <b class="r1"></b> <b class="r2"></b>
    <b class="r3"></b> <b class="r4"></b>
  </b>
  <!-- content goes here -->
  <b class="rbottom">
    <b class="r4"></b> <b class="r3"></b>
    <b class="r2"></b> <b class="r1"></b>
  </b>
</div>
+

into their document (even if it is via the DOM), I begin to shudder. Maybe it’s the nagging purist in me, but that just seems wrong.
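Scripts like these typically build that markup on the fly rather than asking authors to paste it in. Here is a hypothetical sketch of what that DOM insertion might look like (this is not the actual Nifty Corners source; only the class names come from the snippet above):

```javascript
// Build one of the presentational <b> strips from the snippet above.
// "rtop" gets rows r1-r4 (narrowest first); "rbottom" is the mirror image.
function addCornerStrip(position) {
  var strip = document.createElement('b');
  strip.className = position;
  var rows = (position === 'rtop') ? ['r1', 'r2', 'r3', 'r4']
                                   : ['r4', 'r3', 'r2', 'r1'];
  for (var i = 0; i < rows.length; i++) {
    var row = document.createElement('b');
    row.className = rows[i];
    strip.appendChild(row);
  }
  return strip;
}

// Wrap an existing container's content with the corner strips.
function niftify(container) {
  container.insertBefore(addCornerStrip('rtop'), container.firstChild);
  container.appendChild(addCornerStrip('rbottom'));
}
```

The source document stays clean, but the rendered document still carries all those meaningless elements.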

+

Are we falling into the old patterns again, forcing design issues through hacky markup? Does the use of non-semantic markup (taking a page from Eric, no doubt) make it OK? Does the fact that it’s inserted via the DOM make it any more valid? Where do we draw the line?

+

I don’t have the answer, but I think we need to have the conversation.

+

diff --git a/export/2005-05-28-i-wanna-be-a-big-player.md b/export/2005-05-28-i-wanna-be-a-big-player.md new file mode 100644 index 0000000..af3221b --- /dev/null +++ b/export/2005-05-28-i-wanna-be-a-big-player.md @@ -0,0 +1,12 @@ +--- +title: "I wanna be a big player" +date: 2005-05-28 23:04:03 +comments: false +tags: + - "business" +description: "I opened the latest issue of Baseline to find a giant 2-page spread for 1&1 (a hosting company), touting their “ Dynamic Content Catalog ” and it’s ability to give you “website content like the big players.” Basically, they are offering..." +permalink: /archives/i-wanna-be-a-big-player/ +--- + +

I opened the latest issue of Baseline to find a giant 2-page spread for 1&1 (a hosting company), touting their Dynamic Content Catalog and its ability to give you “website content like the big players.” Basically, they are offering to syndicate content (news, sports, games, etc.) onto your site, so you no longer have to worry about keeping your site fresh or interesting.

+

I feel like this is an attempt to reintroduce the idea that every site needs to be a portal (why that concept is still floating about I’ll never know). I also see this as flying in the face of one of the most important business objectives: establishing a brand, voice, etc. through copywriting. If all you have to offer your clients is data with no distillation, why bother?

diff --git a/export/2005-06-06-why-intel.md b/export/2005-06-06-why-intel.md new file mode 100644 index 0000000..5ca491f --- /dev/null +++ b/export/2005-06-06-why-intel.md @@ -0,0 +1,23 @@ +--- +title: "Why Intel?" +date: 2005-06-06 21:19:40 +comments: false +tags: + - "business" + - "humor" +description: "I’ve been doing a little research lately into new laptops and I am finally starting to understand a little more about processors, etc. , so I am dumbfounded to hear that Apple is dumping IBM ’s PowerPC chips for Intel’s Pentium line..." +permalink: /archives/why-intel/ +--- + +

I’ve been doing a little research lately into new laptops and I am finally starting to understand a little more about processors, etc., so I am dumbfounded to hear that Apple is dumping IBM’s PowerPC chips for Intel’s Pentium line. From my experience, Intel chips a) run really hot and b) suffer from a severe processing bottleneck (3.2 Gigahertz with a 533 Megahertz Front-Side Bus? WTF?). It seems to me that it would have made more sense for Apple to go with AMD; they’ve got incredibly powerful chips which, as I understand it, do not suffer from these problems. Maybe there’s something I’m missing; after all, I’m not a chip guy (or a Hollywood mogul).

+

Enough about the switch, I wanted to share some humor. I love this exchange on Slashdot in reaction to the news:

+
+

Dispel any remaining doubts; we are now living in the evil mirror universe.

+
+
+

I’ll believe that when the Red Sox win the World Series!

+
+
+

Yeah, right — that’s about as likely as finding out who Deep Throat is.

+
+

You can read the whole thread if you like.

diff --git a/export/2005-06-20-standardizing-nomenclature.md b/export/2005-06-20-standardizing-nomenclature.md new file mode 100644 index 0000000..402526d --- /dev/null +++ b/export/2005-06-20-standardizing-nomenclature.md @@ -0,0 +1,16 @@ +--- +title: "Standardizing Nomenclature" +date: 2005-06-20 14:54:15 +comments: false +tags: + - "(x)HTML" + - "coding" + - "web standards" +description: "I agree with Richard : peoples’ eyes do glaze over when you say “semantic,” but they don’t have to. When Molly & I co-teach or when I am on my own, I always try to strike a balance by alternating “meaningful” and “semantic.” I feel it..." +permalink: /archives/standardizing-nomenclature/ +--- + +

I agree with Richard: people’s eyes do glaze over when you say “semantic,” but they don’t have to. When Molly & I co-teach or when I am on my own, I always try to strike a balance by alternating “meaningful” and “semantic.” I feel it is important that “semantic” does not go away because it does have value. That said, it is necessary to relate to your audience, no matter what their level of experience, so I think alternating the terms and showing the interchangeability of the two is beneficial for everyone.

+

It is also very important to stress the difference between “structure” and “semantics.” Way too many people (myself included) have used these terms interchangeably, when they are not the same. “Semantics” is about meaning whereas “structure” deals with the framework of your markup. Some say structure has only to do with your XHTML skeleton (DOCTYPE, html, head & body), but I view a page like a house. To me the “structure” is the framing upon which you build your roof, walls and floors. In XHTML, that translates not only to your document skeleton, but also to how you use divs to “frame” your content, how you use heading tags to designate content sections, etc.

+

Confusion arises in some cases when elements are both. In the case of heading tags, they are semantically meaningful (each tag conveying the relative importance of the heading it wraps in relation to the document and the other headings) and structural (forming the document outline). Additional confusion seeps in when we discuss how structural divs should be identified or classified semantically.

+

This sort of nomenclature confusion is something we need to overcome. Our industry is still very new and we are all learning a little more every day. Sharing a common language is very important for effective communication (especially in our global community) and is something I think needs to be stressed even more as we move forward. I think this is yet another area where we need to establish standards and, by having discussions like this, we are taking the first steps toward establishing them.

diff --git a/export/2005-07-15-rip-dif.md b/export/2005-07-15-rip-dif.md new file mode 100644 index 0000000..6798f46 --- /dev/null +++ b/export/2005-07-15-rip-dif.md @@ -0,0 +1,13 @@ +--- +title: "RIP DiF" +date: 2005-07-15 03:18:31 +comments: true +tags: + - "business" +description: "Sad news, friends… Design In-Flight is closing shop. This very young, yet stellar PDF -based magazine was off to a fantastic start, but, as Andy put it , changes in his personal and professional life have conspired to make DiF ’s..." +permalink: /archives/rip-dif/ +--- + +

Sad news, friends… Design In-Flight is closing shop. This very young, yet stellar PDF-based magazine was off to a fantastic start, but, as Andy put it, changes in his personal and professional life have conspired to make DiF’s continued publication an impossibility.

+

Though I am disappointed at losing such a great publication, I understand where Andy’s coming from. Having spent six years of my life devoted to a magazine I started in college, I know how tough it can be. I put the fritz on hiatus in the summer of 2000 when I moved to Connecticut, hoping to restart it again as a solely web-based publication, but time has conspired to keep it only a homepage. Will I ever find the time to get it going again or is it only wishful thinking? I’m not sure. I guess time will tell.

+

Andy, best of luck to you in your future endeavors.

diff --git a/export/2005-08-02-halleluiah.md b/export/2005-08-02-halleluiah.md new file mode 100644 index 0000000..d604c21 --- /dev/null +++ b/export/2005-08-02-halleluiah.md @@ -0,0 +1,20 @@ +--- +title: "Halleluiah" +date: 2005-08-02 16:29:14 +comments: false +tags: + - "(x)HTML" + - "browsers" + - "CSS" +description: "I finally got around to reading Chris Wilson’s post about standards support in IE 7 and I have to admit I am more than a little giddy. Working for an ad agency , most clients (and the majority of my coworkers) have not gotten the whole..." +permalink: /archives/halleluiah/ +--- + +

I finally got around to reading Chris Wilson’s post about standards support in IE7 and I have to admit I am more than a little giddy. Working for an ad agency, most clients (and the majority of my coworkers) have not gotten the whole Web Standards thing, mostly because they only use IE. I can’t express how much relief I feel that IE7 is going to fall in line with most of the other browsers out there with regard to standards support.

+

Improvements of particular note are alpha transparency in PNGs and support for <abbr>s. When it comes to CSS, a whole host of bugs/issues have been fixed (mostly culled from Quirksmode and Position is Everything). I am left with two nagging questions, however:

+
    +
1. Will the * html CSS hack still be supported in IE7 or will that be phased out so * html only affects IE6 and below?
2. Has the DOM interface been upgraded so we can run more powerful scripts in IE7?
+

Overall, I think this is a huge step forward. I have to hand it to Chris, the IE team and WaSP’s Microsoft Corporation Task Force for making this all a reality. A very hearty thank you goes out to all of you, you made my year.

+

UPDATE: A recent blog entry on IE Blog has informed us that the * html selector will not be supported by IE7 in “strict” mode.

diff --git a/export/2005-08-22-dif-is-dead-long-live-dif.md b/export/2005-08-22-dif-is-dead-long-live-dif.md new file mode 100644 index 0000000..86dc37a --- /dev/null +++ b/export/2005-08-22-dif-is-dead-long-live-dif.md @@ -0,0 +1,16 @@ +--- +title: "DiF is dead. Long live DiF." +date: 2005-08-22 14:55:21 +comments: false +tags: + - "business" + - "culture & society" +description: "Just when I had gotten over the loss of Design In-Flight , the magazine relaunches as a web-only publication. From the history/about page :" +permalink: /archives/dif-is-dead-long-live-dif/ +--- + +

Just when I had gotten over the loss of Design In-Flight, the magazine relaunches as a web-only publication. From the history/about page:

+
+

With the reduction in production efforts with this format, and a slightly less rigid publishing schedule, DiF is sure not to disappear again.

+
+

I certainly hope so. Welcome back!

diff --git a/export/2005-08-26-estate-tax-thoughts.md b/export/2005-08-26-estate-tax-thoughts.md new file mode 100644 index 0000000..1a1c80b --- /dev/null +++ b/export/2005-08-26-estate-tax-thoughts.md @@ -0,0 +1,21 @@ +--- +title: "Estate Tax Thoughts" +date: 2005-08-26 17:53:01 +comments: false +tags: + - "culture & society" +description: "Congress in going back into session after their summer recess and they will be taking a vote on the Estate Tax (or “Death Tax” as some people like to call it). It is a hotly contested issue that I feel very strongly about. The truth is..." +permalink: /archives/estate-tax-thoughts/ +--- + +

Congress is going back into session after their summer recess and they will be taking a vote on the Estate Tax (or “Death Tax,” as some people like to call it). It is a hotly contested issue that I feel very strongly about. The truth is that a lot of public services depend on the revenue generated by the estate tax and the number of people affected by it is less than 1.4% of the population. I should be so lucky to be wealthy enough for my children to have to pay the Estate Tax.

+

I recently wrote to my Senators and Representative to let them know how I feel and I thought I’d share it with you. Maybe you’d like to write to yours.

+
+

Dear Senators Dodd and Lieberman and Congresswoman DeLauro,

+

I am a small business owner and I support preserving the Estate Tax. I owe my life and business to the America the Estate Tax has helped build.

+

The Estate Tax provides the needed revenue to create wonderful services and opportunities for many companies. Without the internet (which the Estate Tax helped fund), I would not be able to be the successful Web Designer I am. In fact, my career path would never have been an option. Likewise, I may not have had the education to do my job (nor my employees, theirs) had it not been for the public school system, also funded in part by the Estate Tax. Without a stable mail service, I would not be able to send the invoices and receive the payments my business depends on. Without the infrastructure our public highways and roadways provide, I would not be able to travel to meet with clients and my business would suffer. The same goes for air travel: it would not be as safe or reliable if the Federal Government had not used tax revenues (including the Estate Tax) to make it so.

+

If I should become so wealthy that my children would even have to pay the Estate Tax, I do not feel it would be unfair for the U.S. Government to ask for a little back to repay the society that has made my business, job and lifestyle a reality. In order to ensure future generations can achieve the success that I have, we need to keep the Estate Tax.

+

Sincerely,

+

Aaron Gustafson

+
+


diff --git a/export/2005-08-28-my-opinion-on-the-alas-redesign.md b/export/2005-08-28-my-opinion-on-the-alas-redesign.md new file mode 100644 index 0000000..9213919 --- /dev/null +++ b/export/2005-08-28-my-opinion-on-the-alas-redesign.md @@ -0,0 +1,12 @@ +--- +title: "My opinion on ALA’s redesign" +date: 2005-08-28 13:14:29 +comments: false +tags: + - "business" + - "design" +description: "Yeah, I’m weighing into the debate on the ALA redesign. I have to say I agree with Jon and Jeremy regarding the fixed 1024 px width. My little 2¢; to add to this discussion is that I think more designers should consider the wonderous..." +permalink: /archives/my-opinion-on-the-alas-redesign/ +--- + +

Yeah, I’m weighing into the debate on the ALA redesign. I have to say I agree with Jon and Jeremy regarding the fixed 1024px width. My little 2¢ to add to this discussion is that I think more designers should consider the wondrous world of CSS switching based on browser width (see Rammstein or the slightly better implementation on Drink-drive-lose.com’s Ad Challenge). Using this technique, users can view your site at their most comfortable screen resolution and you can still have a nicely designed page for them (fixed or liquid… or both). Can you say zoom layouts? I knew you could.
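For the curious, here is a rough sketch of how such width-based switching can work: toggle the disabled flag on titled (alternate-capable) stylesheets whenever the window is resized. The breakpoint and the stylesheet titles (“narrow”, “wide”) are hypothetical; the sites mentioned above each do it their own way.

```javascript
// Pick a stylesheet title based on the available width.
// The 1024px breakpoint here is purely illustrative.
function pickStylesheet(width) {
  return (width < 1024) ? 'narrow' : 'wide';
}

// Enable the matching titled stylesheet and disable the rest.
// (IE would need document.documentElement.clientWidth in place of
// window.innerWidth.)
function switchStylesheets() {
  var title = pickStylesheet(window.innerWidth);
  var links = document.getElementsByTagName('link');
  for (var i = 0; i < links.length; i++) {
    var rel = links[i].getAttribute('rel') || '';
    // Only titled stylesheets participate in switching.
    if (rel.indexOf('stylesheet') !== -1 && links[i].getAttribute('title')) {
      links[i].disabled = (links[i].getAttribute('title') !== title);
    }
  }
}

if (typeof window !== 'undefined') {
  window.onresize = switchStylesheets;
  switchStylesheets();
}
```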

diff --git a/export/2005-09-02-death-to-bad-dom-implementations.md b/export/2005-09-02-death-to-bad-dom-implementations.md new file mode 100644 index 0000000..623cfcf --- /dev/null +++ b/export/2005-09-02-death-to-bad-dom-implementations.md @@ -0,0 +1,36 @@ +--- +title: "Death to bad DOM Implementations" +date: 2005-09-02 19:10:52 +comments: true +tags: + - "(x)HTML" + - "coding" + - "JavaScript" + - "web standards" +description: "I just encountered a DOM implementation issue in IE which took about three hours to solve (and like a year off my life). The story goes like this:" +permalink: /archives/death-to-bad-dom-implementations/ +--- + +

I just encountered a DOM implementation issue in IE which took about three hours to solve (and like a year off my life). The story goes like this:

+

I could not, for the life of me, figure out why a form submitted in Firefox was coming through perfectly while it was missing fields in IE. The form in question has some normal fields and some dynamically generated ones (if JavaScript is enabled). The normal stuff was coming through fine, but I was getting no values for the dynamically generated fields when the form was submitted in IE. I checked the $_REQUEST variable (I am using PHP) to see what was coming through, just to be sure.

+

I immediately figured it was missing name attributes, but I was using the proper syntax to create the input elements via the DOM (note: the actual JS is more generic than this)

+

var inpt = document.createElement('input');
inpt.setAttribute('name', 'company');
+

Sure enough, when I looked at the page through the Web Accessibility Toolbar’s View Generated Source, it was missing the name attribute:

+

<INPUT id=company maxLength=255>
+

After about another hour or two of fruitless Google-ing, I finally typed in the magic phrase (setting the name attribute in Internet Explorer) and ended up on Bennett McElwee’s blog post of the same name. Suddenly it was all clear and (as I expected) IE’s botched implementation of the DOM’s createElement function was to blame.

+

According to the MSDN page on the name attribute (linked and quoted in the blog entry):

+
+

The NAME attribute cannot be set at run time on elements dynamically created with the createElement method. To create an element with a name attribute, include the attribute and value when using the createElement method.

+
+

It continued with the following example:

+
+

var oAnchor = document.createElement("<A NAME='AnchorName'></A>");
+

+

The script “solution” Bennett posted was somewhat of a red herring, however, as Firefox would actually execute the createElement intended for IE and end up with an element named “<input name="company" />”, which would be rendered on the page as

+

<<input name="company" /> id="company" maxlength="255" />
+

Perhaps you can see why this would be problematic.

+

I augmented Bennett’s script slightly and renamed the function createElementWithName so I wouldn’t have to use it on every element I created in the script:

+

function createElementWithName(type, name) {
  var element;
  // First try the IE way; if this fails then use the standard way
  if (document.all) {
    element = document.createElement('<' + type + ' name="' + name + '" />');
  } else {
    element = document.createElement(type);
    element.setAttribute('name', name);
  }
  return element;
}
+

I am not a super fan of the reference to document.all as it feels so much like browser sniffing. I am up for suggestions to improve the function if you have any ideas.
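For what it’s worth, one direction this could go (a sketch I haven’t tested against every browser of the era, and which assumes the generated-source behavior described above): feature-test the breakage itself rather than sniffing for document.all. Since IE silently drops a dynamically set name from its serialized markup, serializing a throwaway element reveals whether the standard path works:

```javascript
// Does this browser honor setAttribute('name', ...) on a
// dynamically created element? IE's serialized markup drops the
// name entirely, so its outerHTML won't contain it. Browsers
// without outerHTML get the benefit of the doubt.
function supportsDynamicName() {
  var test = document.createElement('input');
  test.setAttribute('name', 'test');
  return !test.outerHTML || test.outerHTML.indexOf('name') !== -1;
}

function createElementWithName(type, name) {
  var element;
  if (supportsDynamicName()) {
    element = document.createElement(type);
    element.setAttribute('name', name);
  } else {
    // IE-only fallback: pass the full tag string.
    element = document.createElement('<' + type + ' name="' + name + '" />');
  }
  return element;
}
```

The nice thing about testing the behavior instead of the browser is that a future IE which fixes the bug would automatically stop getting the ugly string syntax.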

+

Anyway, I am posting this to hopefully save someone else from the major headache I had today.

+

diff --git a/export/2005-09-07-adding-more-to-my-plate.md b/export/2005-09-07-adding-more-to-my-plate.md new file mode 100644 index 0000000..10f2cb4 --- /dev/null +++ b/export/2005-09-07-adding-more-to-my-plate.md @@ -0,0 +1,16 @@ +--- +title: "Adding more to my plate" +date: 2005-09-07 11:48:14 +comments: false +tags: + - "business" + - "presentations" + - "projects & products" +description: "It’s funny, but the more I take on, the more zen I get about work. Perhaps it’s the recent addition of a daily trek to the gym in the wee hours of the morning which is getting my day off to a better start. Or maybe it’s the Pragmatic..." +permalink: /archives/adding-more-to-my-plate/ +--- + +

It’s funny, but the more I take on, the more zen I get about work. Perhaps it’s the recent addition of a daily trek to the gym in the wee hours of the morning which is getting my day off to a better start. Or maybe it’s the Pragmatic philosophy which is beginning to take hold since finishing The Pragmatic Programmer and starting Agile Web Development with Rails. Who knows, but I am thankful for the calm.

+

So what else have I added to my already overfull plate? Well, I recently joined the staff of A List Apart as a copy editor. In fact Ross Howard’s High-Resolution Image Printing (in Issue 202) marks my editorial debut at the famed publication. I am very excited about getting to work with Erin, Jeffrey, Eric, Jason and the rest of the ALA all-stars as I have been an avid reader since I discovered it back in 2000. If you are reading an article and notice an overabundance of <abbr> and <dfn>, there’s a good chance I am to blame.

+

I am also pleased to confirm that I will be speaking at SXSW Interactive in March of 2006. At present, I am working on one session with Jeremy Keith and two other panels which are still in the formative stages. I will have more details to provide you all in the coming weeks.

+

Also to come are some great award announcements, a few more articles, and another potentially big announcement in the web standards arena. In the meantime, I am preparing for a private web standards training session down in North Carolina and next week’s trip to Silicon Valley, where Molly, Andy and I will be putting on a great 3-day training session as part of the Web Design and Project Management Tour from WOW.

diff --git a/export/2005-09-08-those-left-behind.md b/export/2005-09-08-those-left-behind.md new file mode 100644 index 0000000..5f4ba60 --- /dev/null +++ b/export/2005-09-08-those-left-behind.md @@ -0,0 +1,14 @@ +--- +title: "Those left behind" +date: 2005-09-08 00:11:30 +comments: false +tags: + - "culture & society" +description: "In the wake of the tragedy that befel the citizens of and visitors to New Orleans recently, I’ve been amazed at the amount of support and kindness being shown to the survivors ( Barbara Bush’s comments notwithstanding ). Help has come..." +permalink: /archives/those-left-behind/ +--- + +

In the wake of the tragedy that befell the citizens of and visitors to New Orleans recently, I’ve been amazed at the amount of support and kindness being shown to the survivors (Barbara Bush’s comments notwithstanding). Help has come from the likely places as well as some unlikely ones and I am sure most of you have already donated money, blood and possibly even a room or two in your house/apartment. While a good deal of assistance is still needed for our fellow humans, there are others in need too: their displaced pets.

+

I received an email from a friend in Florida who has agreed to take in two dogs that made it through OK, but there are thousands more animals that need temporary homes, be they in kennels, animal boarding houses, veterinarian’s offices, animal shelters, foster homes or rescue programs.

+

From what I understand, there are all breeds of dogs and cats in need of our help. Some are in family groups of 2, 3 and 4 while others are solo. And there are volunteers willing to drive them to you, no matter where you live. The current safe houses for these animals are being inundated and some of these pets will have to be euthanized if they are not moved to make room for the incoming animals.

+

If you are interested in taking in a dog or cat (or know someone who is), contact Lynda V. on her cell: 203-515-3024 or at home: 203-227-5308 at any time (day or night).

diff --git a/export/2005-09-20-playing-catch-up.md b/export/2005-09-20-playing-catch-up.md new file mode 100644 index 0000000..2b650b2 --- /dev/null +++ b/export/2005-09-20-playing-catch-up.md @@ -0,0 +1,15 @@ +--- +title: "Playing catch-up" +date: 2005-09-20 13:07:03 +comments: true +tags: + - "books & articles" + - "business" + - "web standards" +description: "I’ve been insanely busy building a new Rails app for a client and travelling a lot for speaking engagements. I just got back from an incredible trip to San Jose (well, Cupertino actually) where Molly , Andy and I were doing some..." +permalink: /archives/playing-catch-up/ +--- + +

I’ve been insanely busy building a new Rails app for a client and travelling a lot for speaking engagements. I just got back from an incredible trip to San Jose (well, Cupertino actually) where Molly, Andy and I were doing some training. I had an amazing time with both of them and it was really fun to see Andy in action (I, unfortunately, did not have the pleasure of seeing him rock the audience at @media). We had a really great group of conference attendees too. I am a little saddened that this was my last stop on the WOW tour (I am missing Hawaii as it takes place on election day, but more on that later), but I have heard some rumblings that the show may go back on the road for a European leg. Fingers crossed.

+

In other news, we’ve pushed a new issue of ALA out the door which includes a fantastic piece by Eric on the new ALA print stylesheet, and a new article of mine is in there as well: Improving Link Display for Print. It’s print mania at ALA, apparently.

+

Anyway, I am apparently going to New Jersey today for work, so I need to get ready. Ta for now.

diff --git a/export/2005-09-23-a-little-random-stuff.md b/export/2005-09-23-a-little-random-stuff.md new file mode 100644 index 0000000..a224d71 --- /dev/null +++ b/export/2005-09-23-a-little-random-stuff.md @@ -0,0 +1,16 @@ +--- +title: "A little random stuff" +date: 2005-09-23 20:47:13 +comments: false +tags: + - "culture & society" +description: "I’ve had a few interesting things come through my inbox of late and I thought I would share them with you:" +permalink: /archives/a-little-random-stuff/ +--- + +

I’ve had a few interesting things come through my inbox of late and I thought I would share them with you:

+ diff --git a/export/2005-09-23-cash-and-prizes.md b/export/2005-09-23-cash-and-prizes.md new file mode 100644 index 0000000..4069c6b --- /dev/null +++ b/export/2005-09-23-cash-and-prizes.md @@ -0,0 +1,15 @@ +--- +title: "Cash and prizes" +date: 2005-09-23 12:16:53 +comments: false +tags: + - "business" + - "projects & products" +description: "With all of the travelling I was doing last week, I forgot to mention that some of the sites Dave & I have worked at Cronin and Company on took home WebAwards from the Web Marketing Association . We took home two “Outstanding Website”..." +permalink: /archives/cash-and-prizes/ +--- + +

With all of the travelling I was doing last week, I forgot to mention that some of the sites Dave & I have worked on at Cronin and Company took home WebAwards from the Web Marketing Association. We took home two “Outstanding Website” awards (think silver) for our sites for Middlesex Hospital and the Drink-Drive-Lose Ad Challenge. I was particularly stoked as I was the designer of both sites and single-handedly built the Ad Challenge site, which had a lot of complex application-level code behind it. We also took home a “Standard of Excellence” award (think bronze) for the site we built for Garelick Farms’ Over the Moon Milk product launch. I’m not a big fan of that site’s design (not because it wasn’t mine), but I love the game Dave built.

+

I was also a judge of the WebAwards (as I have been for the last few years and, no, I did not judge any sites I was involved with) and I have to say I was very pleased to see more sites moving to web standards. Out of the 15 or so I judged in the initial round, there were at least two or three that were well on their way to being exemplary in the standards world (compared to 0 only two years ago). Like most marketing-related awards, Flash always seems to trump standards, but it seems like developers are starting to clue in to the standards trend more and more every year.

+

This year I also had the privilege of judging the Best of Show. I was pulling for Project Rebirth but National Geographic ended up taking home the top prize for Inside the Mafia which was also in my top 3 picks.

+

All in all, this year’s WebAwards were pretty good to us (and a step up from the two “Standard of Excellence” awards we pulled last year for Ride4Ever and Bertucci’s Restaurants). I look forward to next year’s competition as well as the one at SXSW, where we hope to go from finalist to winner this year.

diff --git a/export/2005-10-02-new-tutorial-westhost-on-rails.md b/export/2005-10-02-new-tutorial-westhost-on-rails.md new file mode 100644 index 0000000..095fc88 --- /dev/null +++ b/export/2005-10-02-new-tutorial-westhost-on-rails.md @@ -0,0 +1,14 @@ +--- +title: "New tutorial: Westhost on Rails" +date: 2005-10-02 15:40:20 +comments: true +tags: + - "books & articles" + - "Ruby & Rails" + - "servers" +description: "I have been hosting on Westhost for a little over four years now with no major complaints and I also host the majority of my clients there. They offer a lot of options for very little money and are always adding new features to the..." +permalink: /archives/new-tutorial-westhost-on-rails/ +--- + +

I have been hosting on Westhost for a little over four years now with no major complaints and I also host the majority of my clients there. They offer a lot of options for very little money and are always adding new features to the accounts. Unfortunately, Ruby on Rails is not one of them… yet.

+

As I have begun working a bit more with Rails, I have been looking to get it installed on my server (as well as some of my clients’). One of the major half-truths of Rails evangelism is the ease of install, especially with a host running Apache 1.3. After doing a few rather painful installs myself for some new projects, I finally decided to document the process of installing Ruby on Rails at Westhost for my own knowledge and to help any others who may be trying to do the same. Hopefully, Westhost will soon start to offer Rails installs as part of their hosting packages, but, until then, I offer up this humble tutorial.

diff --git a/export/2005-10-11-jeremy-keith-and-me.md b/export/2005-10-11-jeremy-keith-and-me.md new file mode 100644 index 0000000..42375ac --- /dev/null +++ b/export/2005-10-11-jeremy-keith-and-me.md @@ -0,0 +1,16 @@ +--- +title: "Jeremy Keith and Me" +date: 2005-10-11 16:04:11 +comments: false +tags: + - "books & articles" + - "business" + - "JavaScript" + - "presentations" + - "web standards" +description: "It managed to sneak past me for a few days, but my recent interview with Jeremy Keith has made its way into the latest issue of Digital Web Magazine . In the interview we cover the impetus behind his new book , WaSP ’s DOM Scripting..." +permalink: /archives/jeremy-keith-and-me/ +--- + +

It managed to sneak past me for a few days, but my recent interview with Jeremy Keith has made its way into the latest issue of Digital Web Magazine. In the interview we cover the impetus behind his new book, WaSP’s DOM Scripting Task Force, and Jeremy’s future as a rockstar.

+

On a somewhat tangential note, if you’re interested in competing at the SXSW Web Awards in March, I recommend getting your entries in soon. If you enter by Friday (October 14th), you only have to pay $10 per category, which is too cheap to miss out on. If you attend, you’ll get more of Jeremy & me on “How to Bluff Your Way in DOM Scripting.”

diff --git a/export/2005-10-12-proof-positive-that-editing-is-an-art.md b/export/2005-10-12-proof-positive-that-editing-is-an-art.md new file mode 100644 index 0000000..bc694f7 --- /dev/null +++ b/export/2005-10-12-proof-positive-that-editing-is-an-art.md @@ -0,0 +1,12 @@ +--- +title: "Proof Positive that Editing is an Art" +date: 2005-10-12 16:24:37 +comments: true +tags: + - "culture & society" + - "humor" +description: "Have you ever gone to see a movie that is nothing like the trailer ?" +permalink: /archives/proof-positive-that-editing-is-an-art/ +--- + +

Have you ever gone to see a movie that is nothing like the trailer?

diff --git a/export/2005-10-19-need-a-job.md b/export/2005-10-19-need-a-job.md new file mode 100644 index 0000000..4affbc4 --- /dev/null +++ b/export/2005-10-19-need-a-job.md @@ -0,0 +1,13 @@ +--- +title: "Need a job?" +date: 2005-10-19 17:07:27 +comments: false +tags: + - "accessibility" + - "business" + - "web standards" +description: "I’ve been informed of a great position in southern Connecticut for a strong standards designer/developer. The full-time position is at dLife.com , a site whose primary focus is providing information and support to individuals and..." +permalink: /archives/need-a-job/ +--- + +

I’ve been informed of a great position in southern Connecticut for a strong standards designer/developer. The full-time position is at dLife.com, a site whose primary focus is providing information and support to individuals and families touched by diabetes. The parent company, LifeMed Media, is forming a formidable web team and is planning some great new material for this 11,000+ page site. There’s lots of content to play with and a heavy focus on standards. If you’ve got mad CSS skills and a penchant for semantic markup, this may be up your alley. As the site’s focus is the diabetic community (and diabetes can affect your vision), accessibility knowledge is also a good skill to bring to the table.

diff --git a/export/2005-10-24-proof-you-can-find-anything-on-ebay.md b/export/2005-10-24-proof-you-can-find-anything-on-ebay.md new file mode 100644 index 0000000..457981b --- /dev/null +++ b/export/2005-10-24-proof-you-can-find-anything-on-ebay.md @@ -0,0 +1,11 @@ +--- +title: "Proof you can find anything on eBay" +date: 2005-10-24 01:37:40 +comments: false +tags: + - "culture & society" + - "humor" +permalink: /archives/proof-you-can-find-anything-on-ebay/ +--- + +

diff --git a/export/2005-10-24-san-francisco-never-looked-so-tasty.md b/export/2005-10-24-san-francisco-never-looked-so-tasty.md new file mode 100644 index 0000000..8ae4c8f --- /dev/null +++ b/export/2005-10-24-san-francisco-never-looked-so-tasty.md @@ -0,0 +1,12 @@ +--- +title: "San Francisco never looked so… tasty?" +date: 2005-10-24 12:19:55 +comments: false +tags: + - "culture & society" + - "humor" +description: "We always knew San Francisco was filled with rainbow pride, but it now seems the source was not what I expected . It reminds me of the mashed potato CN Tower scene in Canadian Bacon ." +permalink: /archives/san-francisco-never-looked-so-tasty/ +--- + +

We always knew San Francisco was filled with rainbow pride, but it now seems the source was not what I expected. It reminds me of the mashed potato CN Tower scene in Canadian Bacon.

diff --git a/export/2005-10-26-in-2030-google-became-self-aware.md b/export/2005-10-26-in-2030-google-became-self-aware.md new file mode 100644 index 0000000..590e351 --- /dev/null +++ b/export/2005-10-26-in-2030-google-became-self-aware.md @@ -0,0 +1,15 @@ +--- +title: "In 2030 Google became self-aware…" +date: 2005-10-26 20:02:27 +comments: false +tags: + - "business" + - "culture & society" + - "humor" +description: "Some interesting news on the Google front: there’s been sightings of a new Google universe which looks more than a little scary, especially in light of the mockumentary-with-a-stright-face known as EPIC 2014 ." +permalink: /archives/in-2030-google-became-self-aware/ +--- + +

Some interesting news on the Google front: there have been sightings of a new Google universe, which looks more than a little scary, especially in light of the mockumentary-with-a-straight-face known as EPIC 2014.

+

What are they planning? If its present growth and expansion continue, will Google be subject to anti-trust legislation, or will it claim it isn’t because it is simply aggregating content?

+

The only thing I know for sure is that Google-rage is going to grow.

diff --git a/export/2005-10-27-debugging-javascript-just-got-a-little-bit-easier.md b/export/2005-10-27-debugging-javascript-just-got-a-little-bit-easier.md new file mode 100644 index 0000000..e1d9d09 --- /dev/null +++ b/export/2005-10-27-debugging-javascript-just-got-a-little-bit-easier.md @@ -0,0 +1,26 @@ +--- +title: "Debugging JavaScript just got a little bit easier" +date: 2005-10-27 01:59:39 +comments: true +tags: + - "business" + - "coding" + - "JavaScript" + - "projects & products" +description: "Like many of you, I’m sure, I hate debugging JavaScript. Really, it’s not the debugging, per se , as much as it’s using alert() to echo stuff out to the screen. It’s stupid and distracting and takes for ever if you’re debugging a lot of..." +permalink: /archives/debugging-javascript-just-got-a-little-bit-easier/ +--- + +

Like many of you, I’m sure, I hate debugging JavaScript. Really, it’s not the debugging, per se, as much as it’s using alert() to echo stuff out to the screen. It’s stupid and distracting and takes forever if you’re debugging a lot of stuff.

+

For the last few months, I’ve been toying with a few different means of error reporting and echoing out debugging information, but hadn’t been really satisfied with anything I’d come up with. I used to do quite a bit of Flash work back in the day (before Dave came along and put my best efforts to shame) and always loved the Trace window. I liked that you could just echo stuff out to it and it acted as a running tally of pretty much anything you wanted to track: variable values, messages, etc. Two days ago I decided that was what I wanted for JavaScript.

+

I toyed with the idea of spawning a popup and tracing the info to that, but I don’t like popups. They are possibly more annoying than alert messages (well… maybe not). I decided to echo the messages out to a div on the page instead. Then feature creep set in. Before I knew it, it was a draggable, scalable window with some nifty features. Never one to be selfish, I thought other people could find a use for it too, so I’ve released it for anyone who wants it: here it is. Use it, play with it and improve on it as you see fit.
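The basic pattern is easy to sketch. This is only an illustration of the trace-to-an-element idea, not the released script itself (the element id and function below are made up):

```javascript
// Illustrative sketch of tracing to the page instead of using alert()
// (not the released script; names here are hypothetical).
var traceLog = []; // running tally of everything traced

function trace(msg) {
  traceLog.push(msg);
  // Append to the on-page trace pane when a DOM is available.
  if (typeof document !== 'undefined') {
    var pane = document.getElementById('trace-pane');
    if (pane) {
      var line = document.createElement('div');
      line.appendChild(document.createTextNode(msg));
      pane.appendChild(line);
    }
  }
}

trace('loop started');
trace('i = 3');
```

Unlike alert(), each call just adds a line to the running log, so you can trace inside a loop without dismissing dozens of dialogs.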

+

The script currently has the following features:

+ +

Special thanks go out to Aaron Boodman, whose DOM Drag was perfect for the dragging and enabled me to hook up a window stretcher pretty easily, Richard Rutter, whose Browser Stickies were also somewhat of an inspiration, and Dave, for helping me debug the scaling code.

+

Aside: one nice feature of the script is that, once it was operational, I was able to use it to debug itself… how cool is that?

diff --git a/export/2005-10-30-jstrace-two-days-on.md b/export/2005-10-30-jstrace-two-days-on.md new file mode 100644 index 0000000..8cc27e0 --- /dev/null +++ b/export/2005-10-30-jstrace-two-days-on.md @@ -0,0 +1,18 @@ +--- +title: "jsTrace two days on" +date: 2005-10-30 01:04:04 +comments: true +tags: + - "business" + - "coding" + - "JavaScript" + - "projects & products" +description: "The reception for our latest script release , jsTrace , has been fantastic. From the write-up on the DOM Scripting Task Force blog to all of the emails and comments, it’s been great." +permalink: /archives/jstrace-two-days-on/ +--- + +

The reception for our latest script release, jsTrace, has been fantastic. From the write-up on the DOM Scripting Task Force blog to all of the emails and comments, it’s been great.

+

The past few days have seen many ideas, requests and enhancements sent my way. Some have been rolled into the jsTrace 1.1 release, which I made public today. One such enhancement (brought to us by Joe Shelby) I have dubbed “memory,” as it allows the debugging window to remember both its position and size the next time it is opened (via cookies). Further enhancements have been made to the underlying code to streamline development of additional tools for the bottom toolbar, and the font size of the bottom toolbar has been increased (per several requests).
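For the curious, the “memory” idea boils down to serializing the window’s geometry into a cookie string and parsing it back out when the window opens. The function and cookie names below are hypothetical, not taken from the jsTrace source:

```javascript
// Hypothetical sketch of cookie-based window "memory" (illustrative only).
// Pack position and size into a single cookie value...
function packWindowState(state) {
  return 'jsTrace=' + [state.x, state.y, state.w, state.h].join('|');
}

// ...and parse it back out of a cookie string on open.
function unpackWindowState(cookie) {
  var match = /(?:^|; )jsTrace=([^;]*)/.exec(cookie);
  if (!match) return null; // no saved state yet
  var parts = match[1].split('|');
  return { x: +parts[0], y: +parts[1], w: +parts[2], h: +parts[3] };
}

// In a browser you would write document.cookie = packWindowState(...)
// when the window closes and restore with unpackWindowState(document.cookie).
var saved = packWindowState({ x: 40, y: 60, w: 320, h: 240 });
var restored = unpackWindowState(saved + '; other=1');
```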

+

I hope you all enjoy the improvements. Keep ‘em coming.

+

Update: We’ve also been mentioned on DOMScripting.com.

+

Another update (to 1.2): I added a buffer to handle traces executed prior to the jsTrace window being generated. The buffer is written to the viewport once the window is generated.
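The buffering fix is a classic queue-and-replay pattern; a hypothetical sketch (the shipped code differs) looks like this:

```javascript
// Illustrative queue-and-replay sketch (not the shipped jsTrace code).
var buffer = [];        // traces made before the window exists
var viewport = [];      // stands in for the rendered trace viewport
var windowReady = false;

function trace(msg) {
  if (!windowReady) {
    buffer.push(msg);   // window not built yet: queue the message
  } else {
    viewport.push(msg); // window exists: write straight to the viewport
  }
}

function onWindowGenerated() {
  windowReady = true;
  while (buffer.length) {
    viewport.push(buffer.shift()); // replay everything queued so far, in order
  }
}

trace('early message'); // queued
onWindowGenerated();    // buffer replayed into the viewport
trace('later message'); // written directly
```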

diff --git a/export/2005-10-31-spreading-the-praise.md b/export/2005-10-31-spreading-the-praise.md new file mode 100644 index 0000000..bd71771 --- /dev/null +++ b/export/2005-10-31-spreading-the-praise.md @@ -0,0 +1,22 @@ +--- +title: "Spreading the praise" +date: 2005-10-31 15:11:28 +comments: false +tags: + - "business" + - "usability" + - "web standards" +description: "In his most recent essay , Gerry McGovern was discussing expert opinions and voices. One particular comment he made struck a chord:" +permalink: /archives/spreading-the-praise/ +--- + +

In his most recent essay, Gerry McGovern was discussing expert opinions and voices. One particular comment he made struck a chord:

+
+

The Web is maturing. It needs more people like Jakob Nielsen who propose, explain and defend rules.

+
+

Now you can say what you will about Jakob, but I think the sentiment is right. I also think that praise needs to be spread a little farther to include Molly, Eric, Jeffrey and the countless other standards evangelists (both internationally renowned and sitting in the cube next to you) who feel it is their calling to enforce the “rules” of the web. These are people who truly believe, as I do, that constraints are necessary for creativity. Jason Fried mentioned something similar in his discussion of Basecamp (and I am probably paraphrasing):

+
+

Limited time, limited people, limited funding… they make you creative

+
+

I think the same could be said for embracing web standards. I mean look at the Zen Garden, the Web Standards Awards, etc. There is some amazingly creative work out there that embraces the “restrictions” of web standards. Frankly, I think that web standards are the main reason DOM scripting (and all that comes with it) has been able to flourish: standards ensure a solid platform upon which to build anything. Their constraints free you to get creative and really make something new.

+

So let’s hear it for them: a round of applause for all of the standards evangelists out there. Keep up the great work, we appreciate all that you do.

diff --git a/export/2005-11-02-more-developments-in-jstrace.md b/export/2005-11-02-more-developments-in-jstrace.md new file mode 100644 index 0000000..7a75967 --- /dev/null +++ b/export/2005-11-02-more-developments-in-jstrace.md @@ -0,0 +1,15 @@ +--- +title: "More developments in jsTrace" +date: 2005-11-02 04:31:24 +comments: true +tags: + - "business" + - "coding" + - "JavaScript" + - "projects & products" +description: "As I mentioned to Ian earlier today, Dave and I were discussing having the jsTrace window keep pace with whatever the most current line is spit out to it. A few hours later, here it is: jsTrace 1.3 . I have some other stuff (read..." +permalink: /archives/more-developments-in-jstrace/ +--- + +

As I mentioned to Ian earlier today, Dave and I were discussing having the jsTrace window keep pace with the most recent line written to it. A few hours later, here it is: jsTrace 1.3. I have some other stuff (read: paying projects) that needs my attention, so I am putting jsTrace down for a bit. Dave & I will be posting a few more demos of its use in different situations, but as far as further development goes, I’m gonna be hands-off for a while to let you all get a chance to participate.
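The “keep pace” behavior presumably comes down to scrolling the viewport to its full height after each new line, but only when the user was already at the bottom. Here is a hedged sketch against a simulated viewport object; the helper names are mine, not jsTrace’s:

```javascript
// Hypothetical sketch of auto-scrolling a trace viewport (names are illustrative).
// Stick to the bottom only if the user was already there (within a few pixels).
function shouldAutoScroll(scrollTop, clientHeight, scrollHeight) {
  return scrollHeight - (scrollTop + clientHeight) < 4;
}

function appendLine(view, text) {
  var wasAtBottom = shouldAutoScroll(view.scrollTop, view.clientHeight, view.scrollHeight);
  view.lines.push(text);
  view.scrollHeight += 16; // each line is ~16px tall in this simulation
  if (wasAtBottom) {
    view.scrollTop = view.scrollHeight - view.clientHeight; // pin to the newest line
  }
}

// Simulated viewport (a real one would be the trace window's scrollable div,
// whose scrollTop/scrollHeight the browser maintains for you):
var view = { lines: [], scrollTop: 100, clientHeight: 60, scrollHeight: 160 };
appendLine(view, 'newest trace line');
```

The nice side effect of the “was at bottom” check is that a user who scrolls up to read an older trace isn’t yanked back down by every new line.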

+

And if you’re in the participatory mood, check out this site I built with Adaptive Path. I will be posting some details about the project and how I accomplished certain design features once Kel’s campaign’s over and life gets a little less hectic.

diff --git a/export/2005-11-03-disneystorecouk-in-retrograde.md b/export/2005-11-03-disneystorecouk-in-retrograde.md new file mode 100644 index 0000000..3b72823 --- /dev/null +++ b/export/2005-11-03-disneystorecouk-in-retrograde.md @@ -0,0 +1,15 @@ +--- +title: "DisneyStore.co.uk in retrograde" +date: 2005-11-03 11:08:29 +comments: false +tags: + - "business" + - "web standards" +description: "Andy’s beautiful standards-based design is gone, replaced by a table-based pile of (ahem) tag soup. Being the gentleman that he is, Andy has chosen to remain silent on the issue , but he has provided a forum for anyone else who wants to..." +permalink: /archives/disneystorecouk-in-retrograde/ +--- + +

Andy’s beautiful standards-based design is gone, replaced by a table-based pile of (ahem) tag soup. Being the gentleman that he is, Andy has chosen to remain silent on the issue, but he has provided a forum for anyone else who wants to chime in. Molly posted an open letter to Disney which I think sums the whole thing up quite well:

+
+

Shame on you Disney.

+
diff --git a/export/2005-11-03-oracle-opens-up.md b/export/2005-11-03-oracle-opens-up.md new file mode 100644 index 0000000..a76fdf6 --- /dev/null +++ b/export/2005-11-03-oracle-opens-up.md @@ -0,0 +1,22 @@ +--- +title: "Oracle opens up" +date: 2005-11-03 15:25:17 +comments: false +tags: + - "business" + - "databases" +description: "In hopes of stemming the massive explosion of open source database use, Oracle is preparing an “express” version of it’s Oracle Database 10g line: Oracle Database XE . Like many things on the web right now, it’s currently in beta, with..." +permalink: /archives/oracle-opens-up/ +--- + +

In hopes of stemming the massive explosion of open source database use, Oracle is preparing an “express” version of its Oracle Database 10g line: Oracle Database XE. Like many things on the web right now, it’s currently in beta, with a full release planned for late this year.

+

Courting the open source set is an interesting move for Oracle. PHP developers are the obvious target right now, but I wouldn’t be surprised to see some of the focus shifted to Rails developers in the near future.

+

Here’s a breakdown of some of the features/limitations:

+ +

I haven’t downloaded it to play yet, but there seem to be some fairly detailed instructions on both install and integration on the PHP end.

diff --git a/export/2005-11-05-what-are-they-thinking.md b/export/2005-11-05-what-are-they-thinking.md new file mode 100644 index 0000000..7118a1c --- /dev/null +++ b/export/2005-11-05-what-are-they-thinking.md @@ -0,0 +1,12 @@ +--- +title: "What are they thinking?" +date: 2005-11-05 21:12:02 +comments: false +tags: + - "business" + - "culture & society" +description: "I knew things had taken a turn for the worse when DMCA passed and media owners were discussing their desire to spy on consumers’ computers in search of illicit media, but who knew it had gotten this bad ?" +permalink: /archives/what-are-they-thinking/ +--- + +

I knew things had taken a turn for the worse when the DMCA passed and media owners were discussing their desire to spy on consumers’ computers in search of illicit media, but who knew it had gotten this bad?

diff --git a/export/2005-11-07-wait-for-it.md b/export/2005-11-07-wait-for-it.md new file mode 100644 index 0000000..e670e95 --- /dev/null +++ b/export/2005-11-07-wait-for-it.md @@ -0,0 +1,12 @@ +--- +title: "Wait for it…" +date: 2005-11-07 10:58:21 +comments: false +tags: + - "humor" +description: "A fantastic new t-shirt from J! NX ." +permalink: /archives/wait-for-it/ +--- + +

T-shirt reads “[please wait while image loads]”

+

A fantastic new t-shirt from J!NX.

diff --git a/export/2005-11-11-e-voting-comes-to-ct.md b/export/2005-11-11-e-voting-comes-to-ct.md new file mode 100644 index 0000000..992b353 --- /dev/null +++ b/export/2005-11-11-e-voting-comes-to-ct.md @@ -0,0 +1,21 @@ +--- +title: "e-Voting comes to CT" +date: 2005-11-11 15:22:26 +comments: false +tags: + - "accessibility" + - "culture & society" + - "usability" +description: "Here in Connecticut, the Secretary of State is getting ready to purchase new computerized voting machines and is doing an exhibition of the different options throughout the state’s five Congressional Districts. As computerized voting is..." +permalink: /archives/e-voting-comes-to-ct/ +--- + +

Here in Connecticut, the Secretary of State is getting ready to purchase new computerized voting machines and is doing an exhibition of the different options throughout the state’s five Congressional Districts. As computerized voting is such a hot topic right now, I highly recommend anyone and everyone who lives in my state go to one of the exhibitions and offer some sort of public comment. We need to ensure we get safe voting machines that actually record what we intend them to.

+

Here are the dates:

+ diff --git a/export/2005-11-15-e-voting-options-for-ct-looking-bleak.md b/export/2005-11-15-e-voting-options-for-ct-looking-bleak.md new file mode 100644 index 0000000..a907a6b --- /dev/null +++ b/export/2005-11-15-e-voting-options-for-ct-looking-bleak.md @@ -0,0 +1,30 @@ +--- +title: "e-Voting options for CT looking bleak" +date: 2005-11-15 22:26:20 +comments: false +tags: + - "culture & society" + - "usability" +description: "All this week, the Secretary of State’s office is offering demonstrations of and soliciting public comment on the three finalists for Connecticut’s electronic voting machines. The line was long at Monday’s demonstration at Buckland..." +permalink: /archives/e-voting-options-for-ct-looking-bleak/ +--- + +

All this week, the Secretary of State’s office is offering demonstrations of and soliciting public comment on the three finalists for Connecticut’s electronic voting machines. The line was long at Monday’s demonstration at Buckland Hills Mall in Manchester (easily an hour wait), but well worth it to see what we’re in for when we return to the polls next year.

+

The following is a breakdown of the three machines being demonstrated and the pros and cons of each.

+

Machine 1: Diebold AccuVote TS-X

+

The Diebold machine This voting machine looks like an ATM, which is not surprising given Diebold’s involvement in that market. According to the literature, to use the machine, you take the plastic card (which looks like a basic ATM or credit card) given to you by a poll worker and insert it into the machine. You then use the touch screen to make your choices for each race. When you are nearing completion, you are shown a summary screen, which you review and then choose to cast if it looks good. You can touch individual races or “Review Ballot” to make changes. I have no idea how legible the text on the screen is or how easy this machine is to operate, as when I was observing it, they were only showing the “Insert your card” screen.

+

I also did not get to see the voter-verified paper trail add-on in action, but it consisted of a little slot with a magnifying plastic door over it in the lower right-hand corner of the machine, where one would assume the paper “receipt” drops. The plastic flap was open when I was viewing the machine, so I am uncertain as to whether the slip is removed by a person or not.

+

This machine does offer a set of headphones for someone with disabilities and a numeric keypad which I imagine could be used for making selections, but I was amazed to find that the keypad did not have any braille markings on it to indicate the number. I realize not all blind or visually-impaired people can read braille, but something as basic as the numbers zero through nine can be easily picked up and would greatly improve the usability of this device.

+

Machine 2: Danaher (Guardian) ELECTronic 1242

+

Voting on the Danaher ELECTronic 1242 This electronic voting machine is not exactly what I imagine when I think of an electronic voting system. It is electronic, so I suppose it qualifies, but it feels more like playing a game of Battleship or Operation than casting a vote during an election.

+

The machine consists of a giant light-up board covered by a large sheet of paper with all of the offices and candidates on it. Each race has a blinking red light associated with it. The voter’s job is to extinguish each red light by pressing on the numbered box corresponding to the candidate they choose. A voter can change his or her mind by clicking the box corresponding to the previously chosen candidate to de-select him or her and then make a new choice. When voting is complete, the voter pushes the large, green “VOTE” button at the bottom which casts the voter’s virtual ballot.

+

Personally, I found the “interface” a little awkward to use, as seeing the red lights through the paper was not necessarily easy. I also wonder how easy it would be for someone with macular degeneration or another visual impairment short of blindness. As for the blind, it appears that they are out of luck with this machine. I couldn’t find a single accessibility feature apart from its being usable by someone in a wheelchair (and I question whether the text at the very top of the paper, giving instructions on the machine’s use, is actually readable from a height of 3-4 feet). I also failed to see a paper trail on this machine, but perhaps I just missed it.

+

Machine 3: Avante Vote-Trakker EVC308-SPR

+

Voting on the Vote-Trakker This was the most promising of the three voting machines exhibited, but it too had its drawbacks. The interface was a touch screen, but the arrangement of the races and questions demonstrated was anything but intuitive (obviously a byproduct of developers “designing”). One of the better features of this machine was the ability to “zoom in” on a question or race so that it was essentially all you saw on the screen. The write-in process was a little clunky, requiring the use of a keyboard with no real place to store it for easy access. And who wants to be bothered with juggling a keyboard while trying to vote?

+

Vote-Trakker Paper Trail & Lockbox The most interesting feature of this machine was its paper trail. Each vote cast has a unique serial number (which is untraceable to an individual voter, of course), and when you are almost ready to cast your ballot, a paper tally of your votes is displayed in the box next to the machine for you to look at. If it is correct, you cast your votes via the interface. If it is incorrect, you can return to the voting system and change your choices (or alert a poll worker to a discrepancy if what you see on the screen is not what is on the paper) before viewing a new paper “receipt.” All of the paper receipts are kept in the locked box, which displays them in the event of a recount.
This machine is pretty accessible to mobility- and visually-impaired individuals, although those with complete blindness would still require some assistance to cast their ballot.

+

Final thoughts

+

I am a little disappointed in the range of devices being offered to Connecticut. Perhaps this is a reflection of the poor quality of voting systems available or the poor turnout in the response to the state’s RFP.

+

My main concern stems from the fact that our current voting machines in Connecticut were originally designed in the late 1800s (though they were, I imagine, built in the ’40s or ’50s), and we are still using them. Granted, they are mechanical, and changes to a mechanical device are a little more difficult than software upgrades. Still, it is likely that we will have these electronic machines for many decades to come, so we should have the best machines we can.

+

And this doesn’t even begin to address the potential issues with the underlying software incorrectly recording (or failing to record) votes. I really wish that the companies making the software that powers these machines would have to make their source code available to the public so experts in the field could examine and suggest improvements to the algorithms that will (inevitably) power our democracy.

+

If you are interested in seeing these machines first-hand and offering your comments to the Secretary of State’s office, please attend one of the remaining exhibitions.

+


diff --git a/export/2005-11-18-another-political-divide.md b/export/2005-11-18-another-political-divide.md new file mode 100644 index 0000000..a2ffab3 --- /dev/null +++ b/export/2005-11-18-another-political-divide.md @@ -0,0 +1,14 @@ +--- +title: "Another political divide" +date: 2005-11-18 10:54:04 +comments: false +tags: + - "culture & society" +description: "In the interest of observing politics and activism online, I have signed up to receive numerous newsletters and “action alerts” from groups ranging from MoveOn and the ACLU to the GOP . Politics aside, one thing I find very striking is..." +permalink: /archives/another-political-divide/ +--- + +

In the interest of observing politics and activism online, I have signed up to receive numerous newsletters and “action alerts” from groups ranging from MoveOn and the ACLU to the GOP. Politics aside, one thing I find very striking is how much freedom the “left” gives an individual to add a personal message or rewrite a letter in support of or against a particular issue. The “right,” however, does not seem to have any interest in its activists’ opinions.

+

When I received this recent solicitation from the GOP, for instance, I was not given any opportunity to rewrite the letter in my own words or even add a personal message to the missive. All the GOP seems to want is my signature and the email addresses of my friends. Even if I were to agree with an issue the GOP were asking for my support on, a practice like that makes me very disinclined to take action.

+

On the other “side,” you have action requests such as this petition from MoveOn. Much more emphasis is placed on personalizing the message of the action. Even if I do not have the time to add a personal message, I appreciate the effort to include me in the process and am more likely to take action.

+

Perhaps it is just my conspiracy-addled mind working overtime, but I find this dichotomy more than a little odd. What does the GOP have to fear from their own activists?

diff --git a/export/2005-11-29-daves-work-draws-a-crowd.md b/export/2005-11-29-daves-work-draws-a-crowd.md new file mode 100644 index 0000000..0e65779 --- /dev/null +++ b/export/2005-11-29-daves-work-draws-a-crowd.md @@ -0,0 +1,18 @@ +--- +title: "Dave’s Work Draws a Crowd" +date: 2005-11-29 20:59:42 +comments: false +tags: + - "business" + - "Flash & ActionScript" + - "projects & products" +description: "I just saw a copy of the latest issue of DMNews and Dave’s hard work garnered the Wadsworth Atheneum a feature story and Cronin and Company some major kudos. Here’s an excerpt from the article:" +permalink: /archives/daves-work-draws-a-crowd/ +--- + +

I just saw a copy of the latest issue of DMNews and Dave’s hard work garnered the Wadsworth Atheneum a feature story and Cronin and Company some major kudos. Here’s an excerpt from the article:

+
+

An online campaign initiated by the 161-year-old museum and developed by Glastonbury, CT, ad agency Cronin and Company Inc. doubled visits to the site at www.wadsworthatheneum.org. Components included a SurrealPainter Web tool, banner ads and the seeding of blogs.

+

Central to the campaign is the tool at www.wadsworthatheneum.org/painter. Visitors through Dec. 18 can choose from various colorful backgrounds and objects, then flip, copy, layer or scale them. Once completed, the online artwork can be titled, printed, published and e-mailed to family and friends.

+
+

Be sure to make your own surrealist painting, while you still can.

diff --git a/export/2005-11-29-savvy-marketers-take-note.md b/export/2005-11-29-savvy-marketers-take-note.md new file mode 100644 index 0000000..9f619ea --- /dev/null +++ b/export/2005-11-29-savvy-marketers-take-note.md @@ -0,0 +1,29 @@ +--- +title: "Savvy Marketers Take Note" +date: 2005-11-29 18:14:07 +comments: false +tags: + - "business" + - "coding" + - "search engine optimization" + - "web standards" +description: "One of the web’s preeminent marketing websites, MarketingSherpa , has just published an article which may start a web standards stampede . The focus is Firefox , but the underlying message is standards, standards and more standards:" +permalink: /archives/savvy-marketers-take-note/ +--- + +

One of the web’s preeminent marketing websites, MarketingSherpa, has just published an article which may start a web standards stampede. The focus is Firefox, but the underlying message is standards, standards and more standards:

+
+

Your more savvy Web designers are likely all agog over Firefox, because its support of Web Standards makes it easier to design and maintain effective Web pages. …

+

For marketers, these standards are so darn important because they affect the bottom line: your budget. In fact, The Web Standards Project … estimates that before today’s growing lack of support for standards, the “fractured browser market” was adding at least 25% to the cost of developing Web sites. And that’s just one tiny piece of the revenue picture.

+

“Housing construction, electrical wiring, automobile design, all these benefit from design standards,” says Scott McDaniel, MarketingSherpa’s own Internet Director. “Web site construction is maturing in much the same way.”

+
+

The author, Heidi Anderson, even tallies her “6 Business Benefits of ‘Web Standards-based’ design”:

+
  1. Increased search engine optimization
  2. Proper content presentation, including shopping carts and “contact us” forms
  3. Decreased development and maintenance costs
  4. Lower bandwidth usage
  5. Faster download times
  6. Web viewing beyond the computer (your site on wireless devices & RSS)
+

It’s nice to see marketers taking note of all we can do for them. Now roll up your shirt sleeves and get to work.

diff --git a/export/2005-12-04-gartner-wants-you.md b/export/2005-12-04-gartner-wants-you.md new file mode 100644 index 0000000..a2d9134 --- /dev/null +++ b/export/2005-12-04-gartner-wants-you.md @@ -0,0 +1,22 @@ +--- +title: "Gartner wants you" +date: 2005-12-04 18:30:33 +comments: false +tags: + - "business" + - "web standards" +description: "My friends at Gartner are hiring a new member for their web team. If you are within commuting distance of Stamford, CT (about 45 minutes outside of NYC by train), I highly recommend considering this job as I know first-hand how awesome..." +permalink: /archives/gartner-wants-you/ +--- + +

My friends at Gartner are hiring a new member for their web team. If you are within commuting distance of Stamford, CT (about 45 minutes outside of NYC by train), I highly recommend considering this job as I know first-hand how awesome this team is to work with.


You’ll need the following areas of knowledge (in addition to a thirst for more):


If you’re interested, email my friend Aidan (aidan [dot] brewer [at] gartner [dot] com) and let him know I sent you.

diff --git a/export/2005-12-06-karova-redesigns.md b/export/2005-12-06-karova-redesigns.md new file mode 100644 index 0000000..80128dd --- /dev/null +++ b/export/2005-12-06-karova-redesigns.md @@ -0,0 +1,17 @@ +--- +title: "Karova redesigns" +date: 2005-12-06 21:05:22 +comments: false +tags: + - "(x)HTML" + - "business" + - "coding" + - "CSS" + - "design" + - "web standards" +description: "That beautiful bastion of standards-based e-commerce, Karova , has gotten a face lift. Mr. Malarkey deserves many kudos for yet another rich, engaging and playful design. And if you think the sales materials look good , you should see..." +permalink: /archives/karova-redesigns/ +--- + +

That beautiful bastion of standards-based e-commerce, Karova, has gotten a face lift. Mr. Malarkey deserves many kudos for yet another rich, engaging and playful design. And if you think the sales materials look good, you should see the store management dashboard. I was offered a sneak peek and couldn’t help but fawn over its sophisticated simplicity. It’s not only usable, but it makes managing a web shop (dare I say it) kinda fun. For a little background on the redesign, read Andy’s writeup.


Now if only we could find a suitable partner to bring their product to the US (hint, hint).

diff --git a/export/2005-12-19-holiday-greetings-games.md b/export/2005-12-19-holiday-greetings-games.md new file mode 100644 index 0000000..b2fc718 --- /dev/null +++ b/export/2005-12-19-holiday-greetings-games.md @@ -0,0 +1,19 @@ +--- +title: "Holiday Greetings & Games" +date: 2005-12-19 11:20:14 +comments: true +tags: + - "business" + - "projects & products" +description: "This has been one crazy Fall work-wise, so I apologize for the scarcity of posts, but I do have a few holiday treats for you." +permalink: /archives/holiday-greetings-games/ +--- + +

This has been one crazy Fall work-wise, so I apologize for the scarcity of posts, but I do have a few holiday treats for you.


From my day job at Cronin and Company, we’ve got Cronin’s “Grab Bag of Goodness.” As with most internal projects, this was a major rush job. I take no credit for the design (which was handed to me with no wiggle room), but when it comes to the CSS and DOM Scripting, that I’ll proudly take credit for. Use the code “9301” to get in. Of particular note in this piece:


Then there’s the Easy Designs holiday card. I will spare the commentary on this one with the exception of giving major props to Dave for building the game in a day. I’m pretty darn proud of it, especially since we pretty much went from concept to execution in a matter of days (yeah, procrastination’s a bitch). If you’re interested, you can see a rough approximation of the email that went out (our first Campaign Monitor mailing) or simply play the game.

diff --git a/export/2006-01-09-got-ajax-skills-odeo-beckons.md b/export/2006-01-09-got-ajax-skills-odeo-beckons.md new file mode 100644 index 0000000..0a50769 --- /dev/null +++ b/export/2006-01-09-got-ajax-skills-odeo-beckons.md @@ -0,0 +1,11 @@ +--- +title: "Got AJAX Skills? Odeo beckons" +date: 2006-01-09 15:21:02 +comments: false +tags: + - "business" +description: "The fine folks over at Odeo are looking for an “ AJAX Engineer ” to round out their web dev team. If you live & breathe JavaScript, CSS , XHTML and live in or around San Francisco, drop them a line (jobs [at] odeo [dot] com). Everyone I..." +permalink: /archives/got-ajax-skills-odeo-beckons/ +--- + +

The fine folks over at Odeo are looking for an “AJAX Engineer” to round out their web dev team. If you live & breathe JavaScript, CSS, XHTML and live in or around San Francisco, drop them a line (jobs [at] odeo [dot] com). Everyone I know that works there seems to love it, making me wish I lived a little closer to SF.

diff --git a/export/2006-01-09-repetition-and-replacement.md b/export/2006-01-09-repetition-and-replacement.md new file mode 100644 index 0000000..1ace6e4 --- /dev/null +++ b/export/2006-01-09-repetition-and-replacement.md @@ -0,0 +1,17 @@ +--- +title: "Repetition and Replacement" +date: 2006-01-09 15:26:31 +comments: true +tags: + - "(x)HTML" + - "books & articles" + - "coding" + - "CSS" + - "design" + - "web standards" +description: "While working on a new site for a client, I stumbled upon an application of the Leahy / Langridge method of image replacement… to images. As far as I know, it had not been attempted before and, frankly, I was a little amazed it worked." +permalink: /archives/repetition-and-replacement/ +--- + +

While working on a new site for a client, I stumbled upon an application of the Leahy/Langridge method of image replacement… to images. As far as I know, it had not been attempted before and, frankly, I was a little amazed it worked.


The technique, which I am calling iIR for “img Image Replacement” (a bit of a mouthful, I know), helps you keep your code leaner and meaner without sacrificing stylability or accessibility. You can read the article on the Easy Designs site and feel free to drop your comments below. Maybe you can think of a better name for it too.
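For readers who haven’t seen the original Leahy/Langridge technique, the core trick is to collapse an element’s content box to zero height, push its text out of view behind padding, and let a CSS background image show in its place. A rough sketch of the classic method as applied to a heading (a generic illustration, not necessarily the article’s exact code):

```css
/* Classic Leahy/Langridge image replacement (sketch only):
   the text stays in the markup for screen readers and search
   engines, but is clipped out of view behind the padding box. */
h1#logo {
  height: 0;                           /* collapse the content box    */
  padding-top: 100px;                  /* reserve room for the image  */
  overflow: hidden;                    /* clip the text out of sight  */
  background: url(logo.png) no-repeat;
}
```

The iIR twist, as the title of the article suggests, is pointing this same machinery at `img` elements themselves.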

diff --git a/export/2006-01-11-browse-flickr-with-the-stroke-of-a-pen.md b/export/2006-01-11-browse-flickr-with-the-stroke-of-a-pen.md new file mode 100644 index 0000000..c6a46e7 --- /dev/null +++ b/export/2006-01-11-browse-flickr-with-the-stroke-of-a-pen.md @@ -0,0 +1,11 @@ +--- +title: "Browse Flickr with the stroke of a pen" +date: 2006-01-11 18:10:11 +comments: false +tags: + - "culture & society" +description: "This is perhaps the coolest (albeit experimental) way to browse Flickr : retrievr from System One Labs ." +permalink: /archives/browse-flickr-with-the-stroke-of-a-pen/ +--- + +

This is perhaps the coolest (albeit experimental) way to browse Flickr: retrievr from System One Labs.

diff --git a/export/2006-01-16-consumer-choice-and-fair-use.md b/export/2006-01-16-consumer-choice-and-fair-use.md new file mode 100644 index 0000000..6c458b0 --- /dev/null +++ b/export/2006-01-16-consumer-choice-and-fair-use.md @@ -0,0 +1,18 @@ +--- +title: "Consumer Choice and Fair Use" +date: 2006-01-16 18:09:51 +comments: false +tags: + - "business" + - "internationalization & localization" +description: "In a recent issue of Game Informer , I read an interesting news piece on the upcoming PS 3 , but its significance goes far beyond that system and even the world of video games. In fact, it applies to all digital media." +permalink: /archives/consumer-choice-and-fair-use/ +--- + +

In a recent issue of Game Informer, I read an interesting news piece on the upcoming PS3, but its significance goes far beyond that system and even the world of video games. In fact, it applies to all digital media.


It seems SCE Australia recently lost a court case involving the use of mod chips to play foreign titles. Current video game systems (and indeed the DVD movie industry) use region encoding to keep certain movies and games out of certain areas, but the court ruled region encoding was “an artificial trade barrier that restricted consumers’ choice.” The ruling impacts only those Aussie gamers who wish to play worldwide games and the decision was obviously made in observance of Australia’s copyright laws, but why did it take a court ruling to make it so? It seems like a no-brainer to me.


Of course, mod-ing does still void your warranty, but once that’s up, what skin is it off the video game industry’s teeth if you play a foreign game? They still get paid. It’s not like you’re stealing from anyone. The same should go for movies. Why should I not be able to buy a copy of Delicatessen on DVD simply because I live in the US? I bought a copy on Laser Disk back in the day and the DVD is available in Europe. Granted, I’ve got the damn PAL/NTSC thing to worry about, but really, why do we need region encoding? There’s no such thing for CDs and it works out great for everyone. We can listen to music from anywhere and there is very big money in it for record shops that stock import CDs (at least in the US, most imports run upwards of $30 for a full-length CD). Why shouldn’t the same go for movies and video games?


Alright, so I’ve probably beaten that horse to death now. On to the second interesting little factoid in the article… the one that really scares me: Sony has apparently developed a technology which could be used to stop an individual from playing used games. The technology, developed by PlayStation creator Ken Kutaragi, would encrypt an authentication code on the disk, making it playable only on the first system it is played on. There’s no word on whether this technology will be employed in the PS3, but the sheer fact that something like this has been developed is absurd.


An argument against this technology could probably be made on the grounds of Fair Use here in the US. After all, if you own two of the same video game systems (Perhaps your parents are divorced and you have an XBox at each parent’s house or you’re really lazy and have one upstairs and one downstairs. I don’t know, work with me here…), why should you not be able to play the game you paid $50 for on both? Based on the Australian ruling and its focus on consumer choice, I imagine the Aussies would probably kill it as well, but did anyone stop to think of the repercussions such a technology would have?


Such technology has the power to kill the video game rental industry as well as friendly borrowing, both of which I am sure trigger a good portion of new video game sales to begin with. So not only would it crush the aftermarket (used video game sales which, one assumes, is the target), but it runs the risk of killing the market as well. Then there’s the environmental impact. Think about it: millions of games (and their packaging) rendered useless once they’re done being played. That’s as good an idea as those disposable DVDs Disney came up with. And, of course, not everyone can afford to plunk down $40-50 for a brand new game, so it would likely cut the available market considerably. Do people even think about this shit?


OK, perhaps I’m being just a tad alarmist here. Surely a mod would be available within weeks if not days to disable such “protection,” but I just wish people would think about the consequences of their creations before building them at all.

diff --git a/export/2006-01-18-now-thats-what-i-love-to-hear.md b/export/2006-01-18-now-thats-what-i-love-to-hear.md new file mode 100644 index 0000000..b46baa5 --- /dev/null +++ b/export/2006-01-18-now-thats-what-i-love-to-hear.md @@ -0,0 +1,21 @@ +--- +title: "Now that’s what I love to hear" +date: 2006-01-18 18:07:40 +comments: false +tags: + - "coding" + - "JavaScript" + - "projects & products" +description: "I got an email the other day from Steven Mading, a developer at the BioMagnetic Resonance Bank at the University of Wisconsin . In it, he shared his experience using jsTrace and, with his permission, I’m sharing it with all of you:" +permalink: /archives/now-thats-what-i-love-to-hear/ +--- + +

I got an email the other day from Steven Mading, a developer at the BioMagnetic Resonance Bank at the University of Wisconsin. In it, he shared his experience using jsTrace and, with his permission, I’m sharing it with all of you:


I just thought I’d give a quick thank you to you for the little jsTrace JavaScript utility you made available online. I found it from a Google search and it was exactly what I needed.


It really helped me a lot. I had a problem with some widgets on an HTML form that had a lot of JavaScript hooks (things like onblur, onclick, onfocus, etc). The events were occurring in a weird order and I couldn’t trace what was happening. Using the standard alert() function was useless because making an alert window POP up caused the events to be different and changed the relevant behavior (since onfocus and onblur were a relevant part of the behavior, popping up a window changes the focus and invalidates the debugging information when what I’m trying to do is figure out why the focus changes aren’t happening the way I expect.)


Your jsTrace allowed me to figure out the problem (which, as it turns out, was that when I clicked on Widget B, I was calling BOTH the onclick for Widget B and the onblur for Widget A, but not always in a predictable order). So once I knew that was happening, I was able to redesign my code to work either way and thus fix the bug.


Again, thank you for making this tool publicly available.


I love it when things work out like that. It makes it all worthwhile.
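For anyone who hasn’t hit Steven’s problem before: alert() grabs focus, so the very act of debugging fires extra blur/focus events and corrupts the event order you’re trying to observe. A log-based trace sidesteps that. Here’s a minimal sketch of the idea (my simplification for illustration, not jsTrace’s actual API):

```javascript
// A minimal log-based trace (my own sketch, not jsTrace's actual API).
// alert() steals focus, firing extra blur/focus events; appending to a
// log leaves the page's focus untouched.
var traceLog = [];

function trace(message) {
  // In a browser, a jsTrace-style tool appends each line to an on-page
  // element; here we just record it for later inspection.
  traceLog.push(message);
}

// Handlers like the ones on Steven's form:
function widgetABlur()  { trace('widgetA:blur'); }
function widgetBClick() { trace('widgetB:click'); }

// Clicking Widget B fires both, in whatever order the browser picks;
// the log preserves that order so you can see it after the fact.
widgetABlur();
widgetBClick();

console.log(traceLog.join(' -> ')); // "widgetA:blur -> widgetB:click"
```

Once the real order is visible, redesigning the handlers to tolerate either order (as Steven did) is the easy part.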


Have you had an experience with using jsTrace that you’d like to share? Do you use it or any other scripts we’ve built often? Are any of the user enhancement scripts in use on production websites? Let us know your thoughts, good or bad.

diff --git a/export/2006-02-02-a-load-of-malarkey.md b/export/2006-02-02-a-load-of-malarkey.md new file mode 100644 index 0000000..c500853 --- /dev/null +++ b/export/2006-02-02-a-load-of-malarkey.md @@ -0,0 +1,20 @@ +--- +title: "A Load of Malarkey" +date: 2006-02-02 00:18:04 +comments: true +tags: + - "browsers" + - "CSS" + - "web standards" +description: "Microsoft released Internet Explorer 7 Beta 2 for public consumption yesterday. Based on everything I’d been reading, the development team seemed to be moving in the right direction . I decided to take it for a test drive to see how..." +permalink: /archives/a-load-of-malarkey/ +--- + +

Microsoft released Internet Explorer 7 Beta 2 for public consumption yesterday. Based on everything I’d been reading, the development team seemed to be moving in the right direction. I decided to take it for a test drive to see how things were coming along.


A screenshot of And All That Malarkey in Internet Explorer 7 Beta 2


The answer is “not well” I’m afraid. I took my first journey in the new browser over to my friend Andy’s house. I did this mainly because I love how Andy handles IE. It’s beautiful. I wanted to see if IE7 did the right thing and got the real design for And All That Malarkey. As it began rendering, my heart was racing and I was ecstatic to see the beautiful blues and reds of Andy’s blog coming through. Then, something a little odd happened. Somehow IE7 missed the boat and the page was rendered virtually unreadable (except the latest articles section) because a bunch of And All That Malarkey logos kept popping up everywhere.


Now according to Chris Wilson over at Microsoft (whose team has been working closely with the wonderful folks at WaSP to make IE7 standards-compliant):


Beta 1 makes little progress for web developers in improving our standards support, particularly in our CSS implementation. I feel badly about this, but we have been focused on how to get the most done overall for IE7, so due to our lead time for locking down beta releases and ramping up our team, we could not get a whole lot done in the platform in beta 1. However, I know this will be better in Beta 2


At the same time, I know Andy’s a stand-up guy and his CSS is top notch, easily some of the best I’ve seen, so this problem’s gotta lie with IE7. I guess you can only expect so much from a beta (even a “beta 2”), but that’s a doozy of an error. At least IE7 was easy to uninstall and I’ve got some commentary to leave on the IE7 Beta 2 feedback form.

diff --git a/export/2006-02-06-more-on-ie7-beta-2.md b/export/2006-02-06-more-on-ie7-beta-2.md new file mode 100644 index 0000000..c48781e --- /dev/null +++ b/export/2006-02-06-more-on-ie7-beta-2.md @@ -0,0 +1,33 @@ +--- +title: "More on IE7 Beta 2" +date: 2006-02-06 00:14:54 +comments: true +tags: + - "browsers" + - "business" + - "coding" + - "web standards" +description: "Eric has a very enlightened post for those of you out to document bugs in IE 7 Beta 2 . He also echoed my feelings that this is a beta people!" +permalink: /archives/more-on-ie7-beta-2/ +--- + +

Eric has a very enlightened post for those of you out to document bugs in IE7 Beta 2. He also echoed my feelings that this is a beta, people!


Trying to fix a site that’s “broken” in IE7B2 is kind of like deciding to raze your profitable gas station just because you heard car companies are experimenting with hydrogen fuel cells. When the final version of IE7 comes out, then you can worry about what to do. Maybe your site won’t be “broken” any more, and you won’t have to do anything.


As Eric also mentioned, over on the IE Blog, they’ve posted an entry about the current state of CSS fixes in IE7 and what you should be able to do now. Also, for those of you who (like me) uninstalled the beta because it caused you to lose your IE6, here are instructions on how to have both (found somewhere in the comments on digg.com):

+ diff --git a/export/2006-02-09-honorable-my-ass.md b/export/2006-02-09-honorable-my-ass.md new file mode 100644 index 0000000..6ae5591 --- /dev/null +++ b/export/2006-02-09-honorable-my-ass.md @@ -0,0 +1,39 @@ +--- +title: "Honorable, my ass!" +date: 2006-02-09 17:59:00 +comments: false +tags: + - "culture & society" +description: "It appears that some Dems, including my Representative from the 3rd District in Connecticut, Rosa De Lauro , are trying to cripple our democracy under the guise of public financing." +permalink: /archives/honorable-my-ass/ +--- + +

It appears that some Dems, including my Representative from the 3rd District in Connecticut, Rosa DeLauro, are trying to cripple our democracy under the guise of public financing.


HR 4694, or the “Let the People Decide Clean Campaign Act,” proposed by Representative David Obey (D, WI), would make great strides toward establishing public financing for House of Representatives races, but there’s a catch. The bill contains some very sneaky language regarding how 3rd parties and independent candidates can conduct their campaigns which virtually guarantees the continuance of the two-party system. Let’s break down some of the key provisions:


OK, this seems pretty reasonable and is to be expected.


This is good, but starts to show one of the weaknesses of our electoral system. Independent candidates, that is to say candidates who run on no party line, are always at a disadvantage, even under this system which “levels the playing field.”


If you don’t understand why, take another look at the first provision: “Nominees of parties that had averaged 25%.” In other words, candidates, even those with very little party support, who manage to get the endorsement of a party that has averaged 25% of the votes in the district, will get full public funding. By contrast, an independent candidate has no party, so unless (s)he has run previously and garnered 25% of the vote in the last two elections, (s)he gets no public dough. And here’s another oddity… what if there is an incumbent Independent and (s)he decides not to run again, but another Independent steps up to run? Well, that new candidate has to start from scratch and may not get any public money.


OK, so a candidate or party does not have standing to automatically get public funding. No biggie, but gathering signatures from 10%-20% of the last vote cast to qualify does seem steep. I understand wanting to keep out the non-serious candidates, but that threshold is excessive. The St. Louis Oracle did the math:


[I]n Missouri’s 2nd congressional district, a candidate with a party that won less than 25% of the vote in the last two elections would need nearly 70,000 signatures to qualify for the public funding that her/his Democratic and Republican opponents would get automatically, and only signatures from the 2nd District would count. Nearly 35,000 signatures would be required in order to allow the candidate to spend anything at all on the campaign.


Say that again? What? Barred from spending privately raised money? What’s that all about? This is essentially saying “if you can’t get signatures from 10% of the vote in the last election, you can’t run.” Except that it isn’t. The key phrase is “privately raised.” If you have a massive fortune of your own to fund your campaign, goodonya! We all know how in touch the über-wealthy are with the plight of the common (wo)man struggling to feed his/her family on Wal Mart wages.


What started out sounding promising rapidly spiraled out of control into a bill which really isn’t all that democratic. As if incumbent representatives have such a hard time getting re-elected in the first place. Talk about stacking the deck in your favor!


The whole thing does raise some interesting questions though:


I’ll end this on an interesting side note: In 2004, the sponsor of this bill, Representative Obey, faced a challenger for the first time in his political career: Mike Miles, a Green candidate. Obey refused to debate Miles, saying that he was not a “legitimate” candidate. Miles got one of the highest vote totals of any third party candidate that year (9.37%, or 26,160 votes), came in second in the race, and has already announced that he’s going to run again.

diff --git a/export/2006-02-10-apparently-some-people-just-dont-care.md b/export/2006-02-10-apparently-some-people-just-dont-care.md new file mode 100644 index 0000000..5fbd665 --- /dev/null +++ b/export/2006-02-10-apparently-some-people-just-dont-care.md @@ -0,0 +1,29 @@ +--- +title: "Apparently some people just don’t care" +date: 2006-02-10 00:10:10 +comments: true +tags: + - "accessibility" + - "business" + - "usability" +description: "On WaSP today, Derek wrote an incredibly poignant post about the NFB lawsuit against Target . In fact, I thought it so relevant to the interactive work we do at Cronin and Company (the ad agency I work for), that I forwarded a copy of..." +permalink: /archives/apparently-some-people-just-dont-care/ +--- + +

On WaSP today, Derek wrote an incredibly poignant post about the NFB lawsuit against Target. In fact, I thought it so relevant to the interactive work we do at Cronin and Company (the ad agency I work for), that I forwarded a copy of it to everyone who works there. The reaction was, for the most part, pretty good (at least from those that read it), but there’s always at least one person who just doesn’t get it.


I received the following feedback via email from one of the higher-ups in our company (who shall remain nameless):


Is Target forcing blind people to shop there? If they don’t does Target hurt them in some way?


If it doesn’t meet web standards, why don’t blind people just shop somewhere else? Is Target funded by the government?


If Target doesn’t want to change their web site why should I get upset about it? (I don’t hold any Target stock either.)


I couldn’t believe what I was hearing. What an unhealthy attitude.


I sent him an email back. Perhaps it was a bit harsh—though not as harsh as my first draft—but this is something I’m passionate about. I thought I’d share it because I think accessibility is dismissed as “unimportant” far too often:


I imagine you’ve heard of the Americans with Disabilities Act (ADA) of 1990. The ADA and related laws ensure equal treatment for disabled persons in terms of access, housing, employment, voting, etc. Do you consider wheelchair ramps pointless? Handicapped doors? Elevators? Just curious.


The changes necessary to make a website accessible—in a manner similar to the way “brick and mortar” businesses are required by law to be—are not great at all. In fact, we do it routinely with every site we build (at least every one I oversee) at no additional charge and it takes no additional time, just a little forethought. So cost can’t be an excuse to hide behind.


But if you want another reason to do it, consider SEO (Search Engine Optimization). Google (and all web spidering applications, for that matter) are the greatest consumers of websites in existence. And they don’t see the pretty pictures and they don’t use a mouse. Semantically marked-up, accessible, web standards-based documents routinely generate higher search rankings than non-semantic/standards/accessible ones because the content is accessible.


Then there’s the cost savings in maintenance, the cost savings in server storage space, the cost savings in bandwidth usage, the faster page downloads for your users, and the ability to deploy the same content to multiple devices/media—print, TVs, PDAs and cellphones, just to name a few. In terms of benefits, the list goes on and on, and all of these come at no additional cost when you use web standards.


If you don’t see the point in making the effort for selfless reasons, perhaps these will make more sense.


I realize that most of you reading this have, more likely than not, already joined the accessibility bandwagon. Some selflessly even. I just needed a moment to publicly rant. Thanks.

diff --git a/export/2006-02-14-google-measure-map.md b/export/2006-02-14-google-measure-map.md new file mode 100644 index 0000000..d910b72 --- /dev/null +++ b/export/2006-02-14-google-measure-map.md @@ -0,0 +1,11 @@ +--- +title: "Google Measure Map?!?" +date: 2006-02-14 17:57:44 +comments: false +tags: + - "business" +description: "Just when you though all the good purchases were going to Yahoo! , Google ups the ante by picking up Measure Map , the wonderful blog stats analytics tool from my friends at Adaptive Path . Measure Map is what web analytics should be..." +permalink: /archives/google-measure-map/ +--- + +

Just when you thought all the good purchases were going to Yahoo!, Google ups the ante by picking up Measure Map, the wonderful blog stats analytics tool from my friends at Adaptive Path. Measure Map is what web analytics should be all about—it is so simple and so addictive, no wonder Google had to have it.

diff --git a/export/2006-02-14-leap.md b/export/2006-02-14-leap.md new file mode 100644 index 0000000..7045bf5 --- /dev/null +++ b/export/2006-02-14-leap.md @@ -0,0 +1,23 @@ +--- +title: "Leap." +date: 2006-02-14 17:58:34 +comments: true +tags: + - "books & articles" + - "business" + - "presentations" + - "projects & products" +description: "Well, after much debate and deep deliberation, I did it. I quit my job." +permalink: /archives/leap/ +--- + +

Well, after much debate and deep deliberation, I did it. I quit my job.


Why? Well, it was the confluence of a number of factors, some having to do with the work environment at Cronin and Company, others stemming from my desire to be my own boss, do more training/speaking engagements, and to write more. Mostly, though, I think I needed a change.


I had been working at Cronin for just over three years, both as a contractor and on staff. I was the lead developer in the Digital department and oversaw/had a hand in most of the work we put out. I’m proud of what we accomplished and I am happy to have set the tone for coding there (web standards, accessibility and all that jazz), but I had pretty much gone as far as I could there. I don’t have any interest in project management (I’m nowhere near that organized) and I had no interest in trying to unseat my boss to take over running the department (I find most meetings a waste of time… hallway discussions are much more productive).


I’m not much of a gambler, so shifting to full-time contracting was/is a little scary for me. Luckily, the good folks at Bolt | Peters were gracious enough to offer me a soft landing. Starting Friday, I joined their staff, part-time, as Senior Web Developer and will be working on ethnio.com and the Ethnio application itself. I’m really excited about working with Nate, Mike, Julian and the rest of the B|P crew. I’m also a huge believer in Ethnio. I think it’s going to revolutionize the usability field.


In addition to my work with B|P, I’ll also be taking a lot more time to focus on my work through Easy Designs. We have a lot of interesting projects coming up—both client work and some stuff of our own—which we will be unveiling over the coming months. In fact, since the word started getting around on Friday, I’ve been swamped with phone calls about doing new stuff. It’s great to have a lot of people interested in hiring me, but I wish they had projects which started a little later in the year.


I really got the teaching bug in 2005. Maybe it was spending so much time working with Molly on the WOW Web Design Tour and doing training sessions for the EPA and Gartner. I had such a blast spreading the gospel of web standards and seeing the lightbulbs go off above the heads of attendees. I suppose it’s not surprising given that I come from a family of teachers. My hope is that I will be able to do more speaking/training/etc. without the constraints of a day job.


Then there’s the writing end of things. Looking back at the past year, I’m amazed at how much I ended up getting out there. Articles just kept flowing out of me for some reason. Some were self-published, others ended up gracing the pages of Digital Web Magazine and A List Apart, the latter being where I also took the graciously offered position of Production Editor.


Book work was also in the cards for me as I offered my informal assistance/opinions to Jeremy on early drafts of his fantastic DOM Scripting tome and worked side-by-side with Jen Robbins and Derek Featherstone to completely revamp the 3rd Edition of Web Design in a Nutshell to bring it in-line with web standards and current best practices. The timeline was pretty tight, but I managed to bang out three chapters on CSS hacks, JavaScript, AJAX and the DOM in about 2 weeks of evenings and weekends. It’ll be coming out later this month and it’s been really exciting to have been a part of it. It was such a great experience, in fact, that I’m in the process of brainstorming a few titles of my own… but more on that later.


So here I am, making the move to the uncertain/exhilarating/terrifying/fantastic world of freelancing. I am glad I’m doing it and I really appreciate all of the words of encouragement I’ve gotten from friends and colleagues. The web community is truly that… a community. They are a close social safety net who look out for one another and I couldn’t hope to be a part of a better group of people. Many thanks to all of you and I hope I can be as much help if any of you ever decide to consider this leap yourselves.


PS - If you want my old job, it’s available.

diff --git a/export/2006-02-18-png-color-oddities-in-ie.md b/export/2006-02-18-png-color-oddities-in-ie.md new file mode 100644 index 0000000..0219517 --- /dev/null +++ b/export/2006-02-18-png-color-oddities-in-ie.md @@ -0,0 +1,15 @@ +--- +title: "PNG color oddities in IE" +date: 2006-02-18 12:21:21 +comments: true +tags: + - "coding" + - "design" + - "web standards" +description: "While working on a new site, I started playing around a little more with 8-bit PNG files, comparing them to GIFs. In a few cases the PNG was smaller (it didn’t used to be that way, but perhaps Photoshop CS 2 does a better job of..." +permalink: /archives/png-color-oddities-in-ie/ +--- + +

While working on a new site, I started playing around a little more with 8-bit PNG files, comparing them to GIFs. In a few cases the PNG was smaller (it didn’t used to be that way, but perhaps Photoshop CS2 does a better job of compressing PNG files or something), so I used it. All was good until I started testing the design in IE, where the colors were all off. Here’s a breakdown of how the same graphic (placed as a CSS background image against a background color equal to its own background color) rendered between the two browsers: PNG comparison between Firefox 1.5 and Internet Explorer 6/7B2

+

I am well-aware of the issues regarding IE’s handling of alpha transparency in 24-bit PNGs, but had not heard of any color-related issues with 8-bit PNGs in IE6. I did a test in IE7B2 to see if the error was there too and it was.

+

I did some searching on Google and couldn’t seem to find any documentation on this bug, but it’s certainly something I’d recommend they fix for the final release of IE7. For now, however, the only solutions appear to be adding color-correction to your CSS for IE (if you are dead-set on using an 8-bit PNG) or using a GIF.

diff --git a/export/2006-02-19-further-adventures-in-indifference.md b/export/2006-02-19-further-adventures-in-indifference.md new file mode 100644 index 0000000..7665054 --- /dev/null +++ b/export/2006-02-19-further-adventures-in-indifference.md @@ -0,0 +1,26 @@ +--- +title: "Further adventures in indifference" +date: 2006-02-19 12:16:09 +comments: true +tags: + - "accessibility" + - "business" +description: "As opposed to just adding it to the comments in my original post, I decided to post the continuation of my email conversation with the unnamed executive at my former employer about the Target.com lawsuit as a new entry. This is mostly..." +permalink: /archives/further-adventures-in-indifference/ +--- + +

As opposed to just adding it to the comments in my original post, I decided to post the continuation of my email conversation with the unnamed executive at my former employer about the Target.com lawsuit as a new entry. This is mostly for Derek’s amusement, but I thought of a few other things to say on the subject as well.

+

We’ll start with his response. This is copied directly from his email; I take no credit for the spelling, grammar, etc.:

+
+

I understand all that you are saying. The difference I see is a “public” building like a Target store is an impediment of it doesn’t have a ramp, etc. and it does matter. It is a physical, public places that discriminates if they don’t have the ramps, etc.

+

I just see the web as slightly different. While ‘public’ in a sense, it is just as easy for a disabled person to find a web accessible site as it is to find a non-accessible one. You don’t have to drive or walk anywhere. Just click on a different URL. If Target discriminates against people with disabilities it is their problem, not mine.

+

So if Target doesn’t make their site accessible, they lose for all the reasons you state. And why do I care if Target loses out on all the things you mention? It is just as easy to click on walmart.com or wherever to find the accessibility you need.

+
+

And my response to his:

+
+

I agree that it is not your problem as a citizen of the USA/world/universe/whatever. And, personally, I could care less if most major corporations blipped out of existence. But we (as marketers and people responsible for our clients’ online marketing/branding/presence/etc.) need to be aware of this and know that overcoming this “obstacle” does not take much and that the benefits far outweigh the time and money involved. We need to be able to work with our clients in their best interests, guiding them down the right path from a business standpoint, even if the benefits are not immediately apparent to them.

+

Now I’m not condoning the lawsuit, but it also helps from a PR standpoint not to get sued by a group of citizens with disabilities. After all, who’s going to look like the asshole there?

+
+

After hitting send, I (of course) thought of a bunch of other stuff to say. I will spare you the entirety of my thoughts save this one:

+

It is not just as easy for a disabled person to find a web accessible site as it is to find a non-accessible one. There is a real dearth of e-commerce sites on the web that are accessible. That is a major part of the problem. Perhaps if more e-commerce software companies took a page from Karova, users surfing the web with disabilities or (shock) JavaScript turned off might be able to choose to shop somewhere other than a Target or a Wal-Mart or any other store which does not meet their accessibility (or availability) requirements.

+

Anyway, he hasn’t responded to that last email and perhaps he never will. After all, I gave my notice the next day. And, no, this exchange was not responsible for that in any way.

diff --git a/export/2006-03-06-i-missed-it.md b/export/2006-03-06-i-missed-it.md new file mode 100644 index 0000000..a000e18 --- /dev/null +++ b/export/2006-03-06-i-missed-it.md @@ -0,0 +1,19 @@ +--- +title: "I missed it" +date: 2006-03-06 15:00:54 +comments: true +tags: + - "books & articles" + - "business" + - "coding" + - "CSS" + - "design" + - "JavaScript" + - "web standards" +description: "Apparently I missed the release of Web Design in a Nutshell, 3rd Edition . It hit bookshelves February 23rd. I haven’t seen a physical copy yet, but Jen ’s got a few copies for me when I get down to SXSW ." +permalink: /archives/i-missed-it/ +--- + +

Web Design in a Nutshell, 3rd Edition

+

Apparently I missed the release of Web Design in a Nutshell, 3rd Edition. It hit bookshelves February 23rd. I haven’t seen a physical copy yet, but Jen’s got a few copies for me when I get down to SXSW.

+

Getting prepped for SXSW has me kinda crazed right now, but I wanted to get a quick plug in for the book, since I think it’s such a great reference for doing things the web standards way (and a vast improvement over the 2nd Edition). It was a lot of fun working with Jen on this and I’m looking forward to seeing Derek’s contribution to the title as well. It was also really great to work with Molly, Tantek and Jeremy on the Technical Editing side. Their input was greatly appreciated and, I think, made this book even more valuable.

diff --git a/export/2006-03-09-touchdown.md b/export/2006-03-09-touchdown.md new file mode 100644 index 0000000..76108b7 --- /dev/null +++ b/export/2006-03-09-touchdown.md @@ -0,0 +1,14 @@ +--- +title: "Touchdown…" +date: 2006-03-09 14:57:40 +comments: false +tags: + - "business" + - "conferences" +description: "Kel and I landed in Austin, TX this afternoon. We’re here for SXSWi . It’s like an annual pilgrimage. Molly calls it “geek camp,” but I think of it more as a time to actually see the people I spend so much time with virtually. Between..." +permalink: /archives/touchdown/ +--- + +

Dinner at the Paradise in Austin, TX

+

Kel and I landed in Austin, TX this afternoon. We’re here for SXSWi. It’s like an annual pilgrimage. Molly calls it “geek camp,” but I think of it more as a time to actually see the people I spend so much time with virtually. Between the scant few hours of sleep last night, the flights, and the general insanity of the last few weeks, sleep is looking very good right now.

+

I’ll post some more in the AM, but, in the meantime, you can check out the photos from tonight’s dinner with the usual gang.

diff --git a/export/2006-03-21-geek-camp-wrap-up.md b/export/2006-03-21-geek-camp-wrap-up.md new file mode 100644 index 0000000..2e84d1e --- /dev/null +++ b/export/2006-03-21-geek-camp-wrap-up.md @@ -0,0 +1,27 @@ +--- +title: "“Geek Camp” wrap-up" +date: 2006-03-21 12:12:00 +comments: true +tags: + - "business" + - "presentations" +description: "I had a fantastic time at SXSWi this year. It was great to catch up with old friends, make some new ones, and see what everyone’s been working on for the last year. If you’re interested in seeing the shennanigans you can check out my..." +permalink: /archives/geek-camp-wrap-up/ +--- + +

+

+

+ I had a fantastic time at SXSWi this year. It was great to catch up with old friends, make some new ones, and see what everyone’s been working on for the last year. If you’re interested in seeing the shenanigans you can check out my SXSWi 2006 photostream. Highlights are below:

+

+How to Bluff Your Way in DOM Scripting — Jeremy and I had a blast walking the audience through the wonderful world of DOM Scripting. And, based on the audience feedback, we made quite an impact too. Hopefully we managed to break down some of the misconceptions about JavaScript and the DOM being hard to work with. After all, DOM Scripting doesn’t suffer from nearly as many compatibility issues as CSS. There are some great reviews and blow-by-blows out there if you’re interested.

+

+Web Standards and Search/SEO — It was nice to finally get some dialogue going between the web standards community, the search engines and the SEO folks. It was great to have such incredible people working to make it happen too. Many thanks to Molly, Peter, Tim, Andy, Ed and Eric for putting in the time (and putting up with a little abuse) to get the ball rolling.

+

+The peopleCameron Adams, John Allsopp, Faruk Ateş, Kimberly Blessing, Nate Bolt, Kyle Bradshaw, Andy Budd, Tantek Çelik, Andy Clarke, Craig Cross, Mike Davidson, James Edwards, Derek Featherstone, Nick Finck, Jesse James Garrett, Porter Glendinning, Jenifer Hanen, Jon Hicks, Kenneth Himschoot, Molly Holzschlag, Shaun Inman, Lauren Isaacson, Leslie Jensen, Chris Kaminski, Jeremy Keith, Jessica Keith, Geert Leyseele, Cindy Li, Ian Lloyd, Stuart Langridge, Ethan Marcotte, Tim Mayer, Eric, Kat & Carolyn Meyer, Drew McLellan, Chris Mills, Cameron Moll, Peter Morville, Matt Mullenweg, Dunstan Orchard, Veerle Pieters, Jeff Robbins, Jen Robbins, D. Keith Robinson, Richard Rutter, Jason “Stan” Santa Maria, Christopher Schmitt, Maxine Sherrin, Eris Stassi, Greg Storey, Elly Thompson, Mark Trammell, Jeff Veen, Sergio Villarreal, Khoi Vinh, Rob Weychert, Meri Williams, Simon Willison, Jeffrey Zeldman, and a bunch more I’ve probably left out (but not forgotten, mind you).

+

+Getting my wings (and stinger) — Faruk and I were asked (and agreed) to join the Web Standards Project (WaSP) while at SXSW. As the new kids on the block, we’ll be dealing with comment moderation on the new site, so please… be gentle.

+

+The parties — I didn’t make it to many parties this year, mostly because Kel was ill, but the one I did make it to—hosted by Adaptive Path, Odeo, and Consumating—was fantastic. Next year, Kel & I will take lots of vitamins to make sure we’re in top form for the evening activities.

+

+ So now, after a long night of flying and a few days of recuperating, it’s back to work.

diff --git a/export/2006-03-22-i-wish-id-known-that.md b/export/2006-03-22-i-wish-id-known-that.md new file mode 100644 index 0000000..1f6a388 --- /dev/null +++ b/export/2006-03-22-i-wish-id-known-that.md @@ -0,0 +1,20 @@ +--- +title: "I wish I’d known that…" +date: 2006-03-22 14:08:39 +comments: false +tags: + - "culture & society" + - "presentations" +description: "So, upon returning home from SXSW and cracking open the new issue of Seed , I read something I wish I’d known before:" +permalink: /archives/i-wish-id-known-that/ +--- + +

+

So, upon returning home from SXSW and cracking open the new issue of Seed, I read something I wish I’d known before:

+
+

Having sex is the best way to prepare for a speech.

+
+

How did I miss that? Where was I when that little nugget was passed out?

+

Prior to SXSW, Dave (and his many commenters) had given some great tips for giving memorable (and coherent) presentations. I had kinda skimmed over it at the time, but, after re-reading it, I didn’t see that mentioned anywhere.

+

To make sure the clever editors at Seed were not simply pulling my leg (or some other appendage), I did a little digging and found that this story was everywhere in late January. Where the hell was I?

+

Well, it settles one thing… Kel is definitely travelling with me everywhere I speak now ;-)

diff --git a/export/2006-03-26-job-newyork-presbyterian-hospital.md b/export/2006-03-26-job-newyork-presbyterian-hospital.md new file mode 100644 index 0000000..e0ebd4d --- /dev/null +++ b/export/2006-03-26-job-newyork-presbyterian-hospital.md @@ -0,0 +1,14 @@ +--- +title: "Job: NewYork-Presbyterian Hospital" +date: 2006-03-26 14:07:28 +comments: false +tags: + - "business" +description: "The fine follks over at NewYork-Presbyterian Hospital are looking for a designer/developer to join their 5-person Intranet team. This year, they will be redesigning their employee intranet and moving from a static website to a..." +permalink: /archives/job-newyork-presbyterian-hospital/ +--- + + +

The fine folks over at NewYork-Presbyterian Hospital are looking for a designer/developer to join their 5-person Intranet team. This year, they will be redesigning their employee intranet and moving from a static website to a role-based portal. They will also be implementing a new CMS to run the show. So if you’re an experienced designer/developer with good information architecture skills (and are comfortable working in a Microsoft shop) who is interested in leading a team (not just coding alone in the dark), give them a shout.

diff --git a/export/2006-03-28-speeding-up-your-code-with-the-bitwise-operator.md b/export/2006-03-28-speeding-up-your-code-with-the-bitwise-operator.md new file mode 100644 index 0000000..720fbda --- /dev/null +++ b/export/2006-03-28-speeding-up-your-code-with-the-bitwise-operator.md @@ -0,0 +1,17 @@ +--- +title: "Speeding up your code with the Bitwise Operator (&)" +date: 2006-03-28 14:01:11 +comments: false +tags: + - "books & articles" + - "business" + - "coding" + - "Flash & ActionScript" + - "PHP" +description: "While building a Flash game, I wrote some code to alternate through squares on a grid system and it seemed rather slow. My code made use of the % (modulo) operator and, thinking that was the cause, I went in search of a better solution..." +permalink: /archives/speeding-up-your-code-with-the-bitwise-operator/ +--- + +

+

While building a Flash game, I wrote some code to alternate through squares on a grid system and it seemed rather slow. My code made use of the % (modulo) operator and, thinking that was the cause, I went in search of a better solution. I blew the dust off the Bitwise operator (&) and researched what it actually does. As it turns out, this little bit of programming’s past can be quite handy.

+
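For anyone curious about the trick itself, here’s a minimal sketch (the function names are illustrative, not from the actual game code): for non-negative integers, `n & 1` tests the lowest bit, which gives the same answer as `n % 2` and was historically cheaper, making it a classic way to alternate squares on a grid.

```javascript
// Checkerboard test using the bitwise AND operator: a square is "dark"
// when row + col is odd, i.e. when the lowest bit of the sum is set.
function isDarkSquare(row, col) {
  return ((row + col) & 1) === 1;
}

// Sanity check: the bitwise version agrees with the modulo version
// across a small grid.
for (var row = 0; row < 4; row++) {
  for (var col = 0; col < 4; col++) {
    var bitwise = ((row + col) & 1) === 1;
    var modulo = (row + col) % 2 === 1;
    if (bitwise !== modulo) {
      throw new Error("bitwise and modulo disagree at " + row + "," + col);
    }
  }
}
```

On a modern VM the speed difference is usually negligible, but the idiom is still a tidy way to express even/odd checks.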

Comments & corrections are always welcome and if you have any similar tricks to share, I’d love to hear about them.

diff --git a/export/2006-04-04-getting-naked.md b/export/2006-04-04-getting-naked.md new file mode 100644 index 0000000..b8cb858 --- /dev/null +++ b/export/2006-04-04-getting-naked.md @@ -0,0 +1,16 @@ +--- +title: "Getting Naked" +date: 2006-04-04 15:07:17 +comments: true +tags: + - "(x)HTML" + - "business" + - "coding" + - "CSS" + - "design" + - "web standards" +description: "We feel bad about missing Grey Tuesday , CSS Reboot and all the April Fools shennanigans (too much work, not enough sleep), but we finally found something we can join/support in with little to no difficulty: CSS Naked Day . For the..." +permalink: /archives/getting-naked/ +--- + +

We feel bad about missing Grey Tuesday, CSS Reboot and all the April Fools shenanigans (too much work, not enough sleep), but we finally found something we can join/support with little to no difficulty: CSS Naked Day. For the whole of today (April 5th), we’ve turned the CSS off for both this site and Easy Designs. We hope you enjoy their rich, semantic goodness.

diff --git a/export/2006-04-14-westhost-gets-rails.md b/export/2006-04-14-westhost-gets-rails.md new file mode 100644 index 0000000..684c735 --- /dev/null +++ b/export/2006-04-14-westhost-gets-rails.md @@ -0,0 +1,13 @@ +--- +title: "WestHost gets Rails" +date: 2006-04-14 14:50:29 +comments: false +tags: + - "books & articles" + - "Ruby & Rails" + - "servers" +description: "I received an email announcement yesterday that WestHost (my host of choice) is going to be offering Ruby on Rails for install through its Site Manager. I guess it means that, sometime this month, my tutorial on the subject will no..." +permalink: /archives/westhost-gets-rails/ +--- + +

I received an email announcement yesterday that WestHost (my host of choice) is going to be offering Ruby on Rails for install through its Site Manager. I guess it means that, sometime this month, my tutorial on the subject will no longer be needed. Still, I’m glad it will be a bit easier to set up Rails on WestHost now.

diff --git a/export/2006-04-20-now-hear-this.md b/export/2006-04-20-now-hear-this.md new file mode 100644 index 0000000..4ed2916 --- /dev/null +++ b/export/2006-04-20-now-hear-this.md @@ -0,0 +1,14 @@ +--- +title: "Now hear this" +date: 2006-04-20 12:52:24 +comments: false +tags: + - "coding" + - "JavaScript" + - "presentations" + - "web standards" +description: "In case you missed the real thing, you can now listen to the podcast of How to Bluff Your Way in DOM Scripting , the presentation international male model Jeremy Keith and I gave at SXSW this year. And If you want to follow along, you..." +permalink: /archives/now-hear-this/ +--- + +

In case you missed the real thing, you can now listen to the podcast of How to Bluff Your Way in DOM Scripting, the presentation international male model Jeremy Keith and I gave at SXSW this year. And if you want to follow along, you can pick up the slides on DOMScripting.com.

diff --git a/export/2006-04-27-if-i-can-make-it-there.md b/export/2006-04-27-if-i-can-make-it-there.md new file mode 100644 index 0000000..490c439 --- /dev/null +++ b/export/2006-04-27-if-i-can-make-it-there.md @@ -0,0 +1,14 @@ +--- +title: "If I can make it there…" +date: 2006-04-27 14:45:48 +comments: true +tags: + - "JavaScript" + - "presentations" +description: "That’s right, I’m coming to NYC to join Jeffrey , Eric , “ Stan ” , Khoi & Adam to deliver An Event Apart in its new 2-day format. It’s gonna be a blast and I’m incredibly honored that I was invited to join in the fun. I’ll be dropping..." +permalink: /archives/if-i-can-make-it-there/ +--- + +

+

That’s right, I’m coming to NYC to join Jeffrey, Eric, Stan, Khoi & Adam to deliver An Event Apart in its new 2-day format. It’s gonna be a blast and I’m incredibly honored that I was invited to join in the fun. I’ll be dropping a few teasers between now and July on what I’ll be talking about, but it’s gonna rock your socks off… At least I hope so.

+

Anyway, you can read the official teaser over at the AEA site.

diff --git a/export/2006-04-28-scroll-and-flash.md b/export/2006-04-28-scroll-and-flash.md new file mode 100644 index 0000000..c5ae630 --- /dev/null +++ b/export/2006-04-28-scroll-and-flash.md @@ -0,0 +1,17 @@ +--- +title: "Scroll and Flash" +date: 2006-04-28 02:49:28 +comments: false +tags: + - "animation" + - "JavaScript" + - "presentations" +description: "At SXSW , I gave a sneak peek at the new bizhub Pro site I built for Konica Minolta and, in particular, the “scroll and flash” usability enhancement I added to the product pages. I have gotten a lot of questions about it and the..." +permalink: /archives/scroll-and-flash/ +--- + +

At SXSW, I gave a sneak peek at the new bizhub Pro site I built for Konica Minolta and, in particular, the “scroll and flash” usability enhancement I added to the product pages. I have gotten a lot of questions about it and the technique even generated some discussion over on Geoffrey’s site. Well, the site finally launched and you can now see the “scroll and flash” for yourself.

+

+

To check it out, go to a product page (the bizhub PRO 920, for instance) and click one of the links on the upper right of the focal image and watch the show (or you can go directly to a bookmark). AJAX is used to refresh the page content (with bookmarkable links) and then the “scroll and flash” takes over. Feel free to take a gander at the JS file to see how it’s done.

+
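The “flash” half of the effect is essentially a yellow-fade: once the page has scrolled to the target, its background color fades from a highlight back to normal. Here’s a rough sketch of how that interpolation might work (my own illustration, assuming a simple per-channel RGB fade, not the production code):

```javascript
// Build the sequence of CSS color values for fading a highlight
// color back to the background color over a number of steps.
// Each frame would then be applied to element.style.backgroundColor
// on a timer (setInterval or a library effect).
function fadeSteps(from, to, steps) {
  var frames = [];
  for (var i = 1; i <= steps; i++) {
    var frame = [];
    for (var c = 0; c < 3; c++) {
      frame.push(Math.round(from[c] + (to[c] - from[c]) * (i / steps)));
    }
    frames.push("rgb(" + frame.join(",") + ")");
  }
  return frames;
}

// Yellow fading to white over five frames.
var frames = fadeSteps([255, 255, 0], [255, 255, 255], 5);
```

In practice, libraries like script.aculo.us wrap exactly this kind of interpolation up in a ready-made highlight effect, which is why Thomas Fuchs gets a credit below.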

I have to give it up to Shaun Inman and Thomas Fuchs as it was their hard work that made this easy for me to do.

+

Update: It looks like someone has not taken proper care in managing these pages since I left Cronin and Company (where I had built the site), so not all of the links appear to be working (because their anchors have been removed… tsk, tsk). I’ve let Cronin know and hopefully that will be fixed soon.

diff --git a/export/2006-05-06-web-standards-sex-partners-spam.md b/export/2006-05-06-web-standards-sex-partners-spam.md new file mode 100644 index 0000000..475ad50 --- /dev/null +++ b/export/2006-05-06-web-standards-sex-partners-spam.md @@ -0,0 +1,20 @@ +--- +title: "Web standards, sex partners & spam" +date: 2006-05-06 14:43:46 +comments: true +tags: + - "humor" + - "web standards" +description: "I just got the best comment spam ever:" +permalink: /archives/web-standards-sex-partners-spam/ +--- + +

I just got the best comment spam ever:

+
+

I’m very, very impressed that this sort of work is being done; Web Design is getting stagnant with people using just styled block-level elements to produce artwork. The incorporation of SVG into sites excites me a lot.

+

How long do you expect it will take for this sort of technology to be widespread?

+

Obviously you can only speak about WebKit realistically, but if it’s going to take ten years for IE Win to gain (full) support, we can’t design with it.

+

I’m amused by the “Becoming more important” line in the first paragraph. This has been a HUGE problem for years - ever since HTML-2.0 was introduced to be more of a layout language and less of a markup language. For an example, you just have to look at this site sex partners [link removed -ed.]. Why is all the text crammed over on the left side of the page with a big blank space on the right side?

+

Why is the default font tiny and unreadable? Fortunately most browsers now let you override the latter problem.

+
+

It is a little disjointed, but the fact that it mentions “block-level elements,” SVG, WebKit and (of course) “sex partners” is hysterical. Someone’s been paying attention in my training sessions.

diff --git a/export/2006-05-14-feedback-on-feedback.md b/export/2006-05-14-feedback-on-feedback.md new file mode 100644 index 0000000..b30bca6 --- /dev/null +++ b/export/2006-05-14-feedback-on-feedback.md @@ -0,0 +1,40 @@ +--- +title: "Feedback on feedback" +date: 2006-05-14 14:37:25 +comments: false +tags: + - "JavaScript" + - "presentations" +description: "SXSW has released their “ Honor Roll” for 2006 and “ How to Bluff Your Way in DOM Scripting ” managed to make it pretty darn close to the top. I had no idea there was such a thing, but apparently audience feedback is tallied and the..." +permalink: /archives/feedback-on-feedback/ +--- + +

SXSW has released their “Honor Roll” for 2006 and “How to Bluff Your Way in DOM Scripting” managed to make it pretty darn close to the top. I had no idea there was such a thing, but apparently audience feedback is tallied and the sessions are graded and then ranked. I’m pretty proud of our 4.69 “GPA” too — it put us solidly in the Top 10. The two other sessions I participated in (“Web Standards & Search Engines” and “Web Standards & SEO”) also garnered some respectable scores (4.01 and 4.06 respectively) and made the list.

+

My co-presenters/panelists all did an incredible job (Andy, Ed, Eric, Jeremy, Molly, Peter, and Tim: my hat is off to you), but my greatest thanks goes out to everyone who took the time not only to come to our sessions, but to provide us with feedback on them as well.

+

The feedback provided by an audience is invaluable. I know it’s sometimes annoying to fill out those comment cards (especially when you’re racing to get a seat at the next session), but it is those very comment cards that help us hone our skills as speakers/presenters/teachers. Your feedback is what makes us better the next time around. Seriously, it means a lot — even (especially) criticism.

+

When developing a session, you write the description to capture the essence of the talk and (hopefully) set expectations for the depth and breadth of coverage, but you’re never sure just how that will be interpreted by attendees. The only way you can know how successful you were at planning, describing and giving the presentation is by receiving audience feedback. That’s why it is so crucial.

+

To me, a session is a success when the majority of the feedback tells me

+
1. I didn’t lose anybody, and
2. people learned something.
+

But it’s hard to strike that balance too. You never really know the audience’s comfort level, especially on the more technical or programming-related topics. That said, you do know you’ve hit the mark (at least for most people) when you receive feedback like this:

+
+

Great breakdown of concepts…

+
+
+

…interesting to even an experienced DOM coder

+
+
+

Broke down things to a real level…

+
+
+

…well organized and not so deep that beginners would get lost

+
+
+

Great/useable content that we can take back to the office.

+
+
+

I’ve learned something useful today!

+
+

Thank you very much for all of your feedback and please keep on commenting.

diff --git a/export/2006-08-11-an-event-apart-nyc-post-mortem.md b/export/2006-08-11-an-event-apart-nyc-post-mortem.md new file mode 100644 index 0000000..58558b2 --- /dev/null +++ b/export/2006-08-11-an-event-apart-nyc-post-mortem.md @@ -0,0 +1,23 @@ +--- +title: "Belated post mortem: An Event Apart NYC" +date: 2006-08-11 00:17:50 +comments: false +tags: + - "JavaScript" + - "presentations" + - "web standards" +description: "I know, I know, I haven’t posted anything in the aftermath of AEA - NYC . Things have been a little busy on the homefront (new roof, kitchen remodelling and the firing of a lazy, lying contractor) and since moving into the new offi ce..." +permalink: /archives/an-event-apart-nyc-post-mortem/ +--- + +

+

I know, I know, I haven’t posted anything in the aftermath of AEA-NYC. Things have been a little busy on the homefront (new roof, kitchen remodelling and the firing of a lazy, lying contractor) and since moving into the new office, I haven’t really felt much like bringing my laptop up to blog in the evening (instead choosing to enjoy spending my time with Kelly). Sorry.

+

To create a nice triumvirate of excuses, I’ll toss in this one too: I’ve been spending a considerable amount of time off-line, working on a chapter for an as-yet unannounced web standards book (more on that soon) in addition to plying my technical editing skills to the latest edition (3rd, I believe) of Jen’s Learning Web Design (note: the link still goes to the 2nd edition) and Andy’s Transcending CSS: The Fine Art of Web Design.

+

Anyway, so, An Event Apart…
I had such a great time at this conference. Not only was it a pleasure to speak at, but I met some awesome folks and got to spend more time with Jeffrey, Eric, Jason, Khoi, Tantek, and Rob. It was also great to spend more time with (Dr.) Kat and Carolyn (Eric’s family) and to finally meet Carrie and Ava (Jeffrey’s family) and Liz (Jason’s bride). I also got to hang out a bit more with Dan & Jon (both formerly of Pixelworthy). They are two truly fantastic gents and they played their roles as the AEA go-fers/whipping boys with gusto. Seriously, these guys rock. From purely a social aspect, AEA was fantastic; like an intimate SXSW.

+

As for the sessions, I thought they were incredible. I thoroughly enjoyed all of Jeffrey’s talks. He is such a great speaker, capable of moving an audience with even the simplest turn of phrase. It was nice to finally see Eric giving a CSS talk too. I’ve only seen him talk microformats (at SXSWi 2005) and general web standards stuff relating to search (on one of my panels at SXSWi 2006). I was delighted to find that he and I tackle layout problems very much the same way… making my methods not seem quite so mad. “Stan” was also a fantastic solo act (I’ve only seen him on panels) and he walked us through how the ALA redesign came to be. Being that I came on board just after the relaunch, it was nice to get some of the backstory.

+

The guest speakers were also a lot of fun to listen to. ze frank had me in tears I was laughing so hard and I had no idea just how much work Khoi puts into blogging and other non-NYT-related activities. It’s amazing he gets any sleep at all. Tantek’s microformats talk was also good because it helped amalgamate a lot of the disparate (and not always clear) information available on microformats into something usable, allowing me to take it beyond the simple hCard and hCal stuff I’ve been using for the last couple months.

+

I also thought the design and code critiques were excellent. The design one could have been a little more hmm, how to put this… aggressive? But design is such a subjective area, it’s hard to critique without some semblance of a creative brief or at least an understanding of the audience. Eric, Tantek and I were a little less forgiving in the code critique, but I think we brought up some really important points and kept it educational for everyone. Tantek’s got a nice write-up of the proceedings over at his site.

+

In all, I had a great time at An Event Apart. You may be thinking sure, but you were a speaker, but I am positive I would have enjoyed it equally as much as an attendee. There were great people, great talks and the food was fantastic.

+

If you feel so inclined, you can check out my photostream from the event as well as the AEA-NYC group photos over at Flickr.

diff --git a/export/2006-09-26-new-article-tour-dates-and-feed-changes.md b/export/2006-09-26-new-article-tour-dates-and-feed-changes.md new file mode 100644 index 0000000..1636d4f --- /dev/null +++ b/export/2006-09-26-new-article-tour-dates-and-feed-changes.md @@ -0,0 +1,21 @@ +--- +title: "New article, tour dates, and feed changes" +date: 2006-09-26 14:49:15 +comments: true +tags: + - "(x)HTML" + - "books & articles" + - "business" + - "CSS" + - "presentations" +description: "Hello, my name is Aaron Gustafson and I’m a delinquent blogger. It’s been over a month since my last confession post ." +permalink: /archives/new-article-tour-dates-and-feed-changes/ +--- + +

Hello, my name is Aaron Gustafson and I’m a delinquent blogger. It’s been over a month since my last confession post.

+

If it makes any difference, I’ll say that I am sorry, I’ve just been a little busy of late. Those of you keeping up with me via my Flickr stream or Plazes will see why: lots of travelling. It doesn’t seem to be ending anytime soon, but I have a bit of downtime today so I thought I’d post some updates.

+

First off, I have a new article up on Digital Web Magazine that is all about the button. If you spend as much time with web forms as I do, I highly recommend checking it out. For some of you, the techniques may seem like old hat, but there are a lot of people out there who still haven’t realized the real power of the button element. On another publishing note, I’ve gotten a promotion to Technical Editor at A List Apart, which is sweet. Many thanks to Erin, Jeffrey and the rest of the team.

+

Also, if you haven’t been to this site lately or don’t follow my events feed on Upcoming.org, I will be speaking at The AJAX Experience in Boston late next month and I have also been booked to speak as part of Web Directions North in Vancouver early next year. I will also be co-leading a one-day workshop with Malarkey while I’m there, if you’re up for some serious CSS-meets-DOM scripting magic.

+

Finally, I’m starting to do some tidying up with regard to my feeds. I’ve had a FeedBurner account for ages, but hadn’t ever really used it until today. I’ve created a new feed for this site which also incorporates my ma.gnolia bookmarks and Flickr photos (which, of late, have seen a lot more attention than this site). For the three of you using the old feed, I’d appreciate it if you could move over to this new one as I may disable the old feed at some point.

+

Anyway, back into the fray… I’m sorry for not writing more often.

+

Mea culpa,
Mea culpa,
Mea maxima culpa.

diff --git a/export/2006-11-22-book-report-nickel-and-dimed.md b/export/2006-11-22-book-report-nickel-and-dimed.md new file mode 100644 index 0000000..993f378 --- /dev/null +++ b/export/2006-11-22-book-report-nickel-and-dimed.md @@ -0,0 +1,21 @@ +--- +title: "Book Report: Nickel and Dimed" +date: 2006-11-22 19:58:00 +comments: true +tags: + - "culture & society" +description: "I just finished reading Barbara Ehrenreich’s Nickel and Dimed and it really opened my eyes. Clevery subtitled “How (Not) To Get By in America,” the book is a chronicle of Ehrenreich’s “adventures“ in survival as a member of the low-wage..." +permalink: /archives/book-report-nickel-and-dimed/ +--- + +

I just finished reading Barbara Ehrenreich’s Nickel and Dimed and it really opened my eyes. Cleverly subtitled “How (Not) To Get By in America,” the book is a chronicle of Ehrenreich’s “adventures” in survival as a member of the low-wage workforce that serves our meals, cleans our homes, and cares for our elderly.

+

The book is divided into three sections, each of which finds Ehrenreich in a new location, looking for work and a place to live. Her first stop was Key West, where she took a job as a waitress at one restaurant before moving to a busier one attached to a hotel. A bit later, she tried to increase her income by picking up some additional work as a maid at said hotel, but the exhaustion (and accompanying pain) got to her and she decided just to stick with the waitressing.

+

In the second section, she journeyed to Maine, where she picked up a job working for a cleaning service during the week and working at a nursing home on the weekends. It was the “off season” in Maine, meaning weekly rents were far cheaper at the extended-stay motels, but she still had problems making ends meet. There’s no doubt that the tourist season would have bankrupted her or had her sharing a single-room efficiency with several other workers.

+

Finally, it was on to the heartland of America, Minnesota, where she was shocked to discover a severe affordable housing shortage. She took a position as an “associate” at Wal-Mart to gain additional insight into the largest private employer in the United States (possibly the world), but no matter how hard she tried, she just could not afford to live, even in the seediest of motels with assistance from local charities and the State.

+

In each location, Ehrenreich tried to live as cheaply as possible, often finding shelter at hotels, motels, and trailer parks that cater to those unable to afford an apartment. And, in Minneapolis, when she couldn’t even afford to do that, a local organization suggested she live at a shelter (while working full-time at Wal-Mart, mind you) until she had saved enough to afford the first month’s rent and security deposit for an apartment in the tight real estate market.

+

While it is arguable that she could not even hope to capture the complete experience by spending just a month in each place (and, of course, being able to return to her “real” life at any time), she was able to glean a good deal of insight into the struggles of low wage workers in this country. Her final chapter, in fact, articulated perfectly some of the thoughts and feelings I’ve had for some time. Here’s an excerpt:

+
+

When poor single mothers had the option of remaining out of the labor force on welfare, the middle and upper middle class tended to view them with a certain impatience, if not disgust. The welfare poor were excoriated for their laziness, their persistence in reproducing in unfavorable circumstances, their presumed addictions, and above all for their “dependency.” Here they were, content to live off “government handouts,” instead of seeking “self-sufficiency,” like everyone else, through a job. They needed to get their act together, learn how to wind an alarm clock, get out there and work. But now that government has largely withdrawn its “handouts,” now that the overwhelming majority of the poor are out there toiling in Wal-Mart or Wendy’s—well, what are we to think of them? Disapproval and condescension no longer apply, so what outlook makes sense?

+

Guilt, you may be thinking warily. Isn’t that what we’re supposed to feel? But guilt doesn’t go anywhere near far enough; the appropriate emotion is shame—shame at our own dependency, in this case, on the underpaid labor of others. When someone works for less pay than she can live on—when, for example, she goes hungry so that you can eat more cheaply and conveniently—then she has made a great sacrifice for you, she has made a gift of some part of her abilities, her health, and her life. The “working poor,” as they are approvingly termed, are in fact the major philanthropists of our society. They neglect their own children so that the children of others will be cared for; they live in substandard housing so that other homes will be shiny and perfect; they endure privation so that inflation will be low and stock prices high. To be a member of the working poor is to be an anonymous donor, a nameless benefactor, to everyone else.

+
+

I highly recommend checking this book out if you’re a social activist interested in pushing for a living wage or are simply interested in the nature of labor and the workforce in America.

diff --git a/export/2007-02-04-talking-with-microsoft-about-ienext.md b/export/2007-02-04-talking-with-microsoft-about-ienext.md new file mode 100644 index 0000000..95b3393 --- /dev/null +++ b/export/2007-02-04-talking-with-microsoft-about-ienext.md @@ -0,0 +1,35 @@ +--- +title: "Talking with Microsoft about IE.next" +date: 2007-02-04 18:49:50 +comments: false +tags: + - "browsers" + - "JavaScript" + - "web standards" +description: "You may recall that the DOM Scripting and Microsoft task forces, in collaboration with JS Ninjas, had been compiling a list of issues, needs, and wants for IE .next over the last few months (a list many of you contributed to as well..." +permalink: /archives/talking-with-microsoft-about-ienext/ +--- + +

You may recall that the DOM Scripting and Microsoft task forces, in collaboration with JS Ninjas, had been compiling a list of issues, needs, and wants for IE.next over the last few months (a list many of you contributed to as well, via your feedback). The list was to focus on what we wanted to see happen in terms of JavaScript support (as IE7 didn’t get much of an update in that area), but when it came down to it, there were other areas we really felt needed some love.

+

The list

+

Last week, our groups voted for what we each saw as priorities and those votes were tallied to create a final list for me to present in Redmond. Though there is obviously a great deal more we want to see in IE.next, we felt several things were critical and wanted to focus on those as a starting point.

+

Tied for first place, in order of priority, were some sort of fast, arbitrary node-matching API and better error reporting. In the realm of DOM Scripting, node-matching is key (just look at the number of scripts out there performing node matching based on CSS selectors, etc.), so being able to tap into a native XPath implementation (which we generally favored over the Selectors API) would greatly improve the speed of script execution. As for the error reporting, perhaps Justin Palmer (of JS Ninjas) said it best:

+
+

We could possibly find ways to fix all the other problems if we could tell what the hell was breaking and why. Without better error reporting, the remaining stuff on that list is just giving us a bigger gun to shoot ourselves in the foot with.

+
+

Next up in our list was a desire for mutable DOM prototypes. This would address the issues that arise from IE’s implementation of DOM objects in JavaScript, where elements of the core DOM are not derived from the standard Object prototype. While not technically a standards-support issue, this request does not conflict with standards and it does provide JavaScript developers with the ability to address some of the issues the IE team may not be able to address themselves in the next release. As Andrew Dupont (another Ninja) remarked, “I think it’s reasonable to ask that a DOM implementation in JavaScript behave like it’s part of JavaScript.”
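To make the request concrete: ordinary JavaScript constructors have mutable prototypes, so a library can extend every instance in one place. The `Widget` type below is purely illustrative, and the commented-out lines sketch what the equivalent move against the DOM would look like in a browser that supported it:

```javascript
// Any ordinary JavaScript constructor has a mutable prototype:
// add a method once and every instance, past and future, gets it.
function Widget( name ){
  this.name = name;
}
Widget.prototype.describe = function(){
  return 'widget: ' + this.name;
};

var w = new Widget( 'sidebar' );
// w.describe() returns 'widget: sidebar'

// The request was that DOM objects work the same way. With mutable
// DOM prototypes, you could patch every element in one shot:
// HTMLElement.prototype.hide = function(){
//   this.style.display = 'none';
// };
```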

+

Next up was a biggie: bring IE’s event system in line with the W3C event model. This has been an issue for a lot of developers and the code to equalize the two event systems makes up a significant chunk of all of the major JS libraries. Getting IE to implement the W3C event system would be a real boon for standards support and would drop the size of many libraries considerably.
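For context, the equalizing code in question follows a familiar pattern. This is a minimal sketch of my own, not any particular library’s implementation:

```javascript
// Minimal cross-browser event attachment: prefer the W3C model and
// fall back to IE's proprietary attachEvent, normalizing the event
// object so handlers can be written once against the standard.
function addEvent( el, type, handler ){
  if( el.addEventListener ){
    // W3C model: `this` is the element, handler receives the event
    el.addEventListener( type, handler, false );
  }else if( el.attachEvent ){
    // IE model: wrap the handler to fake the W3C behavior
    el.attachEvent( 'on' + type, function(){
      var e = window.event;
      e.target = e.srcElement;
      e.preventDefault = function(){ e.returnValue = false; };
      e.stopPropagation = function(){ e.cancelBubble = true; };
      handler.call( el, e );
    });
  }
}
```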

+

Finally, the last of our top 5 was not a JS issue, but rather a CSS one: implement generated content. I don’t know that I really need to get into the reasons why this would be really nice to have.

+

Two “honorable mentions” were included in the list as well: fixing the issues with getAttribute() and setAttribute() and starting to implement some of the features of JS 1.7 (such as block-scope variables using let, etc.).
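For those unfamiliar with `let`, the classic closure-in-a-loop problem shows why block scoping made the list (the example uses the syntax ES2015 later standardized; JS 1.7’s `let` behaved much the same way):

```javascript
// With var, all three closures share a single loop variable,
// so each one reports the final value:
var withVar = [];
for( var i = 0; i < 3; i++ ){
  withVar.push( function(){ return i; } );
}
// withVar[0]() === 3

// With block-scoped let, each iteration gets its own binding:
var withLet = [];
for( let j = 0; j < 3; j++ ){
  withLet.push( function(){ return j; } );
}
// withLet[0]() === 0
```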

+

Not willing to let the IE team off that easy, the document presented also highlighted several other issues which really need addressing including (among others)

+ +

The meeting

+

In Redmond, I met with Pete LePage, a Product Manager at Microsoft Web Platform and Tools, and several other key members on the IE team. We discussed the list and its implications in great detail for nearly two hours. While I am not at liberty to discuss all of the details of the meeting, I can say for certain that the group I met with was keenly aware of the issues we brought up and are eager to address them. One team member even said that he could have easily guessed our top 5.

+

The one concern they have—especially with regard to the event model and getAttribute()/setAttribute()—is that any adjustments they make to bring IE in line with the standards not “break the web” for the large number of sites using the proprietary IE event model, etc. We discussed this particular topic at length as it is a valid concern and I’m happy to say that I think we’re close to a solution on that front.

+

I came away from this meeting with a real sense of hope about where IE is going and am really encouraged by their willingness to engage the standards community (and web developers as a whole) in dialog like this. We did not resolve every issue in our two-hour talk, but I was assured that this was only the first of many steps toward improving IE.next. The IE team wants to continue this conversation and to continue to elicit feedback from the web community as a whole as things progress.

diff --git a/export/2007-02-27-plazes-enters-the-real-world-6.md b/export/2007-02-27-plazes-enters-the-real-world-6.md new file mode 100644 index 0000000..1fce2cb --- /dev/null +++ b/export/2007-02-27-plazes-enters-the-real-world-6.md @@ -0,0 +1,13 @@ +--- +title: "Plazes enters the real world" +date: 2007-02-27 14:14:27 +comments: true +tags: + - "culture & society" + - "humor" +description: "From the plazes photostream ." +permalink: /archives/plazes-enters-the-real-world-6/ +--- + +

+

From the plazes photostream.

diff --git a/export/2007-03-08-heading-south.md b/export/2007-03-08-heading-south.md new file mode 100644 index 0000000..8d9e734 --- /dev/null +++ b/export/2007-03-08-heading-south.md @@ -0,0 +1,18 @@ +--- +title: "Heading South" +date: 2007-03-08 01:46:32 +comments: false +tags: + - "business" + - "JavaScript" + - "presentations" + - "usability" + - "web standards" +description: "Tomorrow morning I’ll be making my annual pilgrimage to SXSW (a.k.a. geek camp). In between catching up with friends, drinking, and checking out some of the excellent panels, I will be co-presenting two 25-minute “power sessions,” a new..." +permalink: /archives/heading-south/ +--- + +

Tomorrow morning I’ll be making my annual pilgrimage to SXSW (a.k.a. geek camp). In between catching up with friends, drinking, and checking out some of the excellent panels, I will be co-presenting two 25-minute “power sessions,” a new format for the conference.

+

On Saturday afternoon, Sarah Nelson of Adaptive Path will join me to present the latest iteration of “Ruining the User Experience,” which I debuted at The AJAX Experience in Boston last October. The session has been completely revamped and I am hopeful it will inspire more developers to work in tandem with user experience folks and vice versa.

+

The following afternoon, Andrew Dupont and I will be talking about “The Future of JavaScript.” We’ll be talking at length about the advancements in JavaScript 1.7 and other assorted geekery.

+

Apart from those sessions, which I obviously need to attend, I haven’t really made up my mind as to what I want to see. There are just so many good sessions. I guess I’ll figure it out when I get there. I am looking forward to bowling, though.

diff --git a/export/2007-03-26-and-now-the-fun-begins.md b/export/2007-03-26-and-now-the-fun-begins.md new file mode 100644 index 0000000..99bf77b --- /dev/null +++ b/export/2007-03-26-and-now-the-fun-begins.md @@ -0,0 +1,25 @@ +--- +title: "And now the fun begins" +date: 2007-03-26 05:24:03 +comments: true +tags: + - "(x)HTML" + - "accessibility" + - "business" + - "coding" + - "CSS" + - "design" + - "JavaScript" + - "presentations" + - "usability" + - "web standards" +description: "Today marked the last day of my “work” here at SXSW and now it’s play time. It’s only been two days of the conference, but it seems like I’ve already done a week’s worth of stuff. My two sessions both went extremely well from my..." +permalink: /archives/and-now-the-fun-begins/ +--- + +

Today marked the last day of my “work” here at SXSW and now it’s play time. It’s only been two days of the conference, but it seems like I’ve already done a week’s worth of stuff. My two sessions both went extremely well from my perspective and the feedback I’ve received has also been very good so far.

+

+

“Ruining the User Experience” was yesterday and I think Sarah and I worked really well together. It flowed well and felt really tight, so I was extremely happy. And being that we were speaking in the “rock and roll room” to a much larger audience than I would have expected, I’m very pleased it went so well. I think we managed to pack a lot of good information into the 25 minutes and nailed the time pretty much dead-on. I do kind of wish the session had been a bit longer, as I would have liked to guide the attendees through a few more examples, but I still think we managed to open a lot of eyes (and minds), so I am happy about that. And we left them wanting more, which is never a bad thing either.

+

I’m not sure exactly how helpful they will be for people (at least not until the audio is posted), but I have uploaded the slides for the session. For those who saw me give a session by the same name at The AJAX Experience in October, this is a real departure from what you saw. I tore the old one to shreds and built this new one from scratch. Attendees at AjaxWorld in New York next week will be treated to a solo 45-minute version of this session with a few added examples.

+

My second session, “The Future of JavaScript” was another 25-minute “power session” and I think it went equally well. It was much more geeky than most of my other sessions have been, with tons of code samples demonstrating some of the really cool stuff in JavaScript 1.6 and 1.7. As John Resig mentioned to us at bowling tonight, Andrew and I were going through the features of the two language upgrades “pretty rapid-fire,” but I think it worked well as a power session because we came in fast and hit the packed room with a lot of new information. I think extending it to 45 or 60 minutes would have been way too overwhelming. As promised, I have posted the slides from that session as well, so folks can copy the examples we used and play around with them on their own.

+

I plan to relax a bit now that the important stuff is over. It was a little too rainy to hit the parties tonight after the bowling shindig, but I hope to engage in a bit more after-hours socialization tomorrow and Tuesday.

diff --git a/export/2007-04-04-naked-again.md b/export/2007-04-04-naked-again.md new file mode 100644 index 0000000..629880a --- /dev/null +++ b/export/2007-04-04-naked-again.md @@ -0,0 +1,15 @@ +--- +title: "Naked again" +date: 2007-04-04 22:06:40 +comments: false +tags: + - "(x)HTML" + - "coding" + - "CSS" + - "design" + - "web standards" +description: "That’s right, we’ve dropped our CSS to celebrate CSS Naked Day . Your turn." +permalink: /archives/naked-again/ +--- + +

That’s right, we’ve dropped our CSS to celebrate CSS Naked Day. Your turn.

diff --git a/export/2007-04-08-ruining-reactions.md b/export/2007-04-08-ruining-reactions.md new file mode 100644 index 0000000..008d0ce --- /dev/null +++ b/export/2007-04-08-ruining-reactions.md @@ -0,0 +1,32 @@ +--- +title: "“Ruining” reactions" +date: 2007-04-08 15:49:36 +comments: false +tags: + - "(x)HTML" + - "accessibility" + - "books & articles" + - "coding" + - "CSS" + - "design" + - "JavaScript" + - "usability" + - "web standards" +description: "There’s been some great discussion surrounding my latest article for A List Apart . It is amazing to see how some people get the idea of progressive enhancement and some just don’t (or perhaps refuse to)." +permalink: /archives/ruining-reactions/ +--- + +

There’s been some great discussion surrounding my latest article for A List Apart. It is amazing to see how some people get the idea of progressive enhancement and some just don’t (or perhaps refuse to).

+

Many folks brought up the point that having requirements for a web-based application is akin to having requirements for a desktop one. I couldn’t agree more, which is why, while discussing some of the shortsighted design choices made by the folks at Lala.com, I said

+
+

For a closed application or service, this might be acceptable, but for a public website it’s a disaster.

+
+

“Public website” is the key there. Lala.com is open to the world. To see the homepage, browse around, or search, you don’t have to sign up; you don’t have to agree to do anything. The public-facing portion of the website is open to everyone, so it should be usable by everyone.

+

When you start talking about closed applications, such as GMail or Basecamp or a CMS, it is acceptable to create a set of requirements for your users. After all, they are choosing to use your service. Keep in mind, however, that even when introducing flashy JavaScript-based functionality, you should still keep your markup clean and semantic, and you should do your best to follow at least the basic accessibility guidelines, especially if you are charging for your app. Even Google came to the realization that they needed a non-JS version of GMail, so you should never rule out having to go that route. And planning for it from the beginning makes it a hell of a lot easier to implement.

+

One active member of the discussion I’d like to single out is Jean McGuire. Her comments have been very thoughtful and I’d like to take a moment to share a wonderful analogy she made:

+
+

For example, if you owned an outdoor goods store, wouldn’t it be a cool idea to have the entrance on the second floor, and have a climbing wall in front to get to it? That would be new! different! unique! But, even leaving out handicapped accessibility requirements (and how much the UPS guy would hate you) do you think any store owner would be that bloody stupid?

+

Sure, maybe most of the customers would be experienced climbers and would have no problem with the wall. Some might even think it’s fun, not just annoying. But what about the non-climbers shopping for birthday presents for climbers? What about the person who just needs fifty meters of really good rope? What about the person bringing a spouse’s sleeping bag in to get a new zipper? What about the climber with one arm in a cast? For that matter, what about the newspaper reporter coming to do a local business profile on your store? (aka a search engine spider)

+
+

Isn’t that just fantastic?

+

I am really happy that this piece has garnered so many reactions and has gotten people both talking and, most importantly, thinking.

diff --git a/export/2007-05-04-webvisions-wrapped.md b/export/2007-05-04-webvisions-wrapped.md new file mode 100644 index 0000000..3dc738f --- /dev/null +++ b/export/2007-05-04-webvisions-wrapped.md @@ -0,0 +1,24 @@ +--- +title: "WebVisions wrapped" +date: 2007-05-04 16:58:21 +comments: true +tags: + - "(x)HTML" + - "accessibility" + - "coding" + - "design" + - "presentations" + - "progressive enhancement" + - "web standards" +description: "I just wrapped my presentation at WebVisions and have posted the slides for my talk, titled “Learning to Love Forms,” up on SlideShare . I have also embedded them below (though the formatting is a bit off on some of the longer sidebars)." +permalink: /archives/webvisions-wrapped/ +--- + +

I just wrapped my presentation at WebVisions and have posted the slides for my talk, titled “Learning to Love Forms,” up on SlideShare. I have also embedded them below (though the formatting is a bit off on some of the longer sidebars).

+

I’d like to thank everyone who attended and especially those who asked the challenging questions. Hopefully this was a good start to my campaign for getting people to embrace forms instead of running from them.

+

For those of you who couldn’t attend, enjoy the slides. I will post the audio for the session as soon as it’s available.

+

+

diff --git a/export/2007-06-26-whoops.md b/export/2007-06-26-whoops.md new file mode 100644 index 0000000..df24290 --- /dev/null +++ b/export/2007-06-26-whoops.md @@ -0,0 +1,11 @@ +--- +title: "Whoops…" +date: 2007-06-26 15:58:42 +comments: false +tags: + - "servers" +description: "I was doing a little server cleanup and moved this site’s folder, forgetting to set the new folder up for mod_rewrite, so permalinks have been broken for the last week or so. Everything is better now (I hope). Please let me know if you..." +permalink: /archives/whoops/ +--- + +

I was doing a little server cleanup and moved this site’s folder, forgetting to set the new folder up for mod_rewrite, so permalinks have been broken for the last week or so. Everything is better now (I hope). Please let me know if you notice any lingering issues.

diff --git a/export/2007-06-27-wouldnt-it-be-nice.md b/export/2007-06-27-wouldnt-it-be-nice.md new file mode 100644 index 0000000..556ba0b --- /dev/null +++ b/export/2007-06-27-wouldnt-it-be-nice.md @@ -0,0 +1,16 @@ +--- +title: "Wouldn’t it be nice?" +date: 2007-06-27 20:25:57 +comments: true +tags: + - "CSS" + - "design" + - "web standards" +description: "Over the last two years, I’ve been wishing for just one thing in CSS : rotation. There’s been some discussion about it on the W3C lists, etc. but no one has made a solid pitch for it yet. Inspired a bit by Andy’s modest..." +permalink: /archives/wouldnt-it-be-nice/ +--- + +

+

Over the last two years, I’ve been wishing for just one thing in CSS: rotation. There’s been some discussion about it on the W3C lists, etc. but no one has made a solid pitch for it yet. Inspired a bit by Andy’s modest column-rule-image proposal, I drafted a spec for CSS 3 Rotation [PDF] back in February. I showed it to a few folks at Web Directions North and got some good feedback.

+

Out of those discussions, I also realized we really needed a text-wrapping property in CSS 3, so I also drafted a spec for CSS 3 Polygonal Margins [PDF]. The idea is based on the polygons we used for image maps back in the day and would allow complete control over how text wraps around an element.

+

Anyway, I’ve sat on these for a while and I figured now was as good a time as any to unveil them to the world and solicit your feedback. So what do you think? Could you use this sort of control in your everyday CSS work? How would you want to see it work? Do you think the plans I’ve outlined offer enough flexibility?

diff --git a/export/2007-07-12-new-easy-app-tipr.md b/export/2007-07-12-new-easy-app-tipr.md new file mode 100644 index 0000000..3872773 --- /dev/null +++ b/export/2007-07-12-new-easy-app-tipr.md @@ -0,0 +1,16 @@ +--- +title: "New Easy! app: Tipr" +date: 2007-07-12 18:07:25 +comments: true +tags: + - "business" + - "mobile" + - "projects & products" +description: "Those of you who’ve been paying attention to the apps that came out of iPhoneDevCamp (or who are using Applists , AppMarks , or any of the other iPhone web app aggregators/launchers) are probably already aware, but we just launched our..." +permalink: /archives/new-easy-app-tipr/ +--- + +

Those of you who’ve been paying attention to the apps that came out of iPhoneDevCamp (or who are using Applists, AppMarks, or any of the other iPhone web app aggregators/launchers) are probably already aware, but we just launched our first micro-application named “Tipr” last week.

+

Tipr is a super-simple web-based tip calculator aimed at mobile devices. You simply enter the bill total and choose a percentage to tip and Tipr does the rest, giving you the tip amount and the total. As an added benefit, the total is always converted to a palindrome to make it easy to spot fraudulent adjustment of your tip amount or total when scanning your credit card or bank statement.
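For the curious, the palindrome conversion is easy to sketch. This is my own reconstruction of the idea, not Tipr’s actual code (which isn’t published); working in integer cents sidesteps floating-point rounding:

```javascript
// Does this amount, in cents, read the same forwards and backwards?
function isPalindrome( cents ){
  var s = String( cents );
  return s === s.split( '' ).reverse().join( '' );
}

// Find the tip closest to the requested percentage that makes the
// grand total a palindrome. billCents and the result are integer cents.
function palindromeTip( billCents, pct ){
  var target = Math.round( billCents * pct / 100 );
  // Walk outward from the target tip until the total is a palindrome.
  for( var d = 0; ; d++ ){
    if( isPalindrome( billCents + target + d ) ){
      return target + d;
    }
    if( target - d > 0 && isPalindrome( billCents + target - d ) ){
      return target - d;
    }
  }
}
```

So a $20.00 bill at 15%, for example, would suggest a $3.32 tip for a $23.32 total.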

+

Tipr started off as a tool for me, really. I have been doing the palindrome thing for over a year now and, even with the calculator on my phone, it takes a minute or so for me to calculate out the tip amount to end up with a palindrome for a total. To speed things up a bit for myself, I built Tipr and it’s proven quite useful. I hope you will find it useful too.

+

We’re trying to keep Tipr pretty simple, but if you have recommendations for improving it, let us know by leaving a comment.

diff --git a/export/2007-07-29-tipr-now-with-added-txt.md b/export/2007-07-29-tipr-now-with-added-txt.md new file mode 100644 index 0000000..e1d8054 --- /dev/null +++ b/export/2007-07-29-tipr-now-with-added-txt.md @@ -0,0 +1,19 @@ +--- +title: "Tipr, now with added txt" +date: 2007-07-29 18:50:19 +comments: true +tags: + - "business" + - "mobile" + - "projects & products" +description: "So, as it turns out, this little app I built for myself is actually useful to other folks." +permalink: /archives/tipr-now-with-added-txt/ +--- + +

So, as it turns out, this little app I built for myself is actually useful to other folks.

+

Over the 3 weeks since it launched, I’ve been keeping an eye on the traffic patterns, reviews, and mentions of Tipr across the intarwebs, but I’ve also been busily adding some new features, which brings me to this post. I knew people with iPhones and other capable mobile browsers were quite happy with Tipr, but folks without a mobile browser or with a sucky one were not, in my opinion, getting as much out of Tipr as I’d like them to. I wanted to correct that.

+

My first thought was to create an SMS service for Tipr, but there’s no way I can afford to rig up a server capable of receiving and replying to SMS messages and I certainly could not afford to pay the $1000-2000/month for an SMS short code (after all, I’m not making any money on this thing). Then the answer dawned on me: Twitter.

+

+

Since Twitter offers an SMS interface (40404 once you register your mobile), I could simply piggyback on their service to offer Tipr via SMS. All I had to do was build a TwitterBot capable of receiving and responding to messages. Lots of folks have built IM bots in the past, but there weren’t that many TwitterBots and there was even less information about building one. Even with the odds stacked against me, however, after about an hour of reading the Twitter API documentation and 6 hours of actual programming, I had built a working PHP-based TwitterBot class.

+

The whole thing works using Twitter’s direct message functionality and runs several independent services to do things like reciprocate friendships, check the inbox, process responses, and send messages back. Unfortunately, the API was only able to get me so far, so I did have to resort to a little hackery to get some of it to work. In the end, though, the Tipr TwitterBot, which sits on top of my generic TwitterBot class, is pretty solid and quite responsive: even with the limitation of 70 API calls per 60 minutes, most messages receive a response in approximately 45 seconds (depending on your network and whether Twitter is releasing a new feature and takes the service offline for a few minutes).

+

Overall, I’m pretty happy with the results and the early beta testers seem to be liking it as well. Hopefully some of you out there will find it as useful (if not more so) than the web interface. If you’re on Twitter, give it a shot and let me know what you think.

diff --git a/export/2007-08-27-a-better-createelementwithname.md b/export/2007-08-27-a-better-createelementwithname.md new file mode 100644 index 0000000..4e03137 --- /dev/null +++ b/export/2007-08-27-a-better-createelementwithname.md @@ -0,0 +1,20 @@ +--- +title: "A better createElementWithName()" +date: 2007-08-27 14:43:59 +comments: true +tags: + - "(x)HTML" + - "coding" + - "JavaScript" +description: "Back in 2005, I wrote a piece about IE ’s abysmal generation of NAME d elements via the DOM (which, interestingly enough, has proven to be one of the most popular posts on the blog, pointing to the fact that this is an obvious pain..." +permalink: /archives/a-better-createelementwithname/ +--- + +

Back in 2005, I wrote a piece about IE’s abysmal generation of NAMEd elements via the DOM (which, interestingly enough, has proven to be one of the most popular posts on the blog, pointing to the fact that this is an obvious pain point for many DOM scripters out there). At the time, I wrote

+

function createElementWithName( type, name ){
var element;
// First try the IE way; if this fails then use the standard way
if( document.all ){
element =
document.createElement( '<'+type+' name="'+name+'" />' );
}else{
element = document.createElement( type );
element.setAttribute( 'name', name );
}
return element;
}
+

It was a complete hack, but it worked. More importantly, however, it began a discussion of a better way to fix the problem in a cross-browser way. The best solution offered was Anthony Lieuallen’s very efficient one-time function definition:

+

function createElementWithName(){}
(function(){
try {
var el=document.createElement( '<div name="foo">' );
if( 'DIV'!=el.tagName ||
'foo'!=el.name ){
throw 'create element error';
}
createElementWithName = function( tag, name ){
return document.createElement( '<' + tag + ' name="' +
name + '"></' + tag + '>' );
}
}catch( e ){
createElementWithName = function( tag, name ){
var el = document.createElement( tag );
// setAttribute might be better here ?
el.name = name;
return el;
}
}
})();
+

And now Brian Adkins has refactored the script into even fewer lines of code:

+

var createElementWithName = ( function(){
try {
var el = document.createElement( '<div name="foo">' );
if( el.tagName !== 'DIV' || el.name !== 'foo' ){
throw 'create failed';
}
return function( tag, name ){
return document.createElement( '<' + tag + ' name="' +
name + '"></' + tag + '>' );
};
}catch( e ){
return function( tag, name ){
var el = document.createElement( tag );
el.setAttribute( 'name', name );
return el;
};
}
})();
+

Great job, Brian, and thanks for sharing.

+

diff --git a/export/2007-09-10-alex-russell-is-not-a-heretic.md b/export/2007-09-10-alex-russell-is-not-a-heretic.md new file mode 100644 index 0000000..3d55f78 --- /dev/null +++ b/export/2007-09-10-alex-russell-is-not-a-heretic.md @@ -0,0 +1,36 @@ +--- +title: "Alex Russell is not a heretic" +date: 2007-09-10 20:39:05 +comments: true +tags: + - "(x)HTML" + - "animation" + - "coding" + - "CSS" + - "design" + - "web standards" +description: "First off, let me preface this by saying I just got back to the East Coast after catching a red-eye from San Francisco on Saturday night, so if I seem a bit incoherent, that’s likely why." +permalink: /archives/alex-russell-is-not-a-heretic/ +--- + +

First off, let me preface this by saying I just got back to the East Coast after catching a red-eye from San Francisco on Saturday night, so if I seem a bit incoherent, that’s likely why.

+

In perhaps the most intellectually-stimulating session at The Rich Web Experience, Alex Russell (of Dojo Toolkit fame) tackled the topic of Standards Heresy.

+

For those who are not aware, Alex was once a staunch standards advocate who has turned to what he considers “the dark side.” In truth, he’s sick and tired of the dysfunctional nature of the W3C and other similar organizations, and I can’t say I blame him. As his session pointed out, the W3C has 60+ paid, full-time staff and yet we saw literally no movement on either (X)HTML or CSS for over five years. That is a travesty.

+

In my opinion, fault lies not with the individuals on the various committees and sub-committees, but rather, with the process. And this isn’t just a problem in the W3C, an organization comprised almost entirely of representatives from the various software vendors (Microsoft, AOL, Opera, etc.) which pay tens of thousands a year to take part. Look at what’s happening with JavaScript 2 in ECMA. Or HTML5, for that matter—just because it is an “open” organization which “anyone can join” doesn’t make the WHAT WG any better. They are all flawed because the process is flawed, and I think that is Alex’s main point (despite his assertion that the WHAT WG is not dysfunctional).

+

So why is the process flawed? Well, for one, spec writing is largely an academic undertaking. In many cases there are invited experts in a Working Group (such as Andy Clarke in the CSS one), but, for the most part, specs are written by people who are not in the trenches. As Alex rightly points out, in fact, many times, the specs are nothing more than an official blessing of some proprietary technique or technology created by a member company. And good ideas that may be very useful to designers or developers are lost because of internal politics or because a browser vendor thinks it would be “too hard” to implement.

+

Which brings me to Alex’s “heresy.” In his session, he proudly declared himself a heretic because he sees a need for innovation in (X)HTML which is currently unavailable because the specs are not evolving quickly enough. I feel his pain, but I think he is looking at the problem the wrong way. He sees the spec (and web standards in general) as stifling innovation. I see web standards as facilitating innovation. After all, were it not for the firm foundation of well-formed documents and a unified DOM (no matter how piecemeal the implementations), we’d still be writing spaghetti code whenever we tried to do anything with JavaScript. It is because of web standards that we can write clean JavaScript and that we can make truly innovative interactions that take us beyond what is allowed for in the specs themselves.

+

But back to Alex’s complaint…as his example of how Dojo is heretical, he showed this code example:

+

<div dojoType="dijit.form.HorizontalSlider"
     name="horizontal1"
     onChange="dojo.byId('slider1input').value=arguments[0];"
     value="10"
     maximum="100"
     minimum="0"
     showButtons="false"
     intermediateChanges="true"
     style="width:50%; height: 20px;"
     id="slider1">
  <!-- … -->
</div>
+

The problem that Dojo is attempting to solve here is the inclusion of a slider form control, which does not exist under the current HTML Forms spec. In order to function, the widget requires several custom attributes to be placed on the element to provide information to assist in the creation of that slider. The fact that Alex “cannot” add these attributes to the document and maintain XHTML validity is a major source of annoyance for him and part of why he has decided that validation is no longer important.

+

But the truth is that Alex could make any or all of these attributes available to whatever elements he wants and still have a validating document by simply creating a DTD (based on any existing (X)HTML one) to include them. After all, the X in XHTML stands for extensible…the language is meant to be improved. Sure, there are some standardistas who think we shouldn’t muck about with the standards, but the powers that be put that extensibility in there for a reason (and I don’t think it was just as a tease). By extending the language to mix in features we desperately need, we drive innovation and, who knows, perhaps someone will take notice and add our extensions into the next version, thereby driving the evolution of the language so many of us desperately want.

+

The same goes for CSS. The W3C made CSS extensible by allowing for custom properties utilizing the -*- syntax. Chances are, you’ve come across this when implementing -moz-border-radius or -webkit-border-radius (which, honestly, both seem superfluous to me when the CSS3 spec includes border-radius as an actual property…why not just support that?). In my mind, this is ripe for use in extension of CSS, not by individual browser vendors (as that is proprietary and closed), but by us (in an open, cross-browser/cross-platform way) to achieve what we want or need that CSS currently does not offer us. That was the basis for my work on gFSS (an experiment in presentational Flash generated from CSS, debuted at Web Directions North this year) and another project I will be releasing in the next few months.
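As a small, hypothetical illustration (not taken from gFSS or any shipping library) of how script can take advantage of that prefixed landscape, here is one way to probe which flavor of the property a given browser understands:

```javascript
// Hypothetical sketch: probe an element's style object to see which
// flavor of border-radius (standard or vendor-prefixed) is supported.
function supportedBorderRadius( el ){
  var candidates = [ 'borderRadius', 'MozBorderRadius', 'WebkitBorderRadius' ],
      i;
  for( i = 0; i < candidates.length; i++ ){
    if( candidates[ i ] in el.style ){
      return candidates[ i ];
    }
  }
  return null; // no rounded-corner support at all
}
```

A script in the spirit of eCSStender could use a check of this sort to decide whether the standard property needs translating into a prefixed one for the current browser.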

+

So, long story long, I don’t think that Alex is a heretic. I think he can make a solid case for extending the language (and the interface) of the web for his particular needs (or the needs of his toolkit) as long as he backs it up with documentation in the form of a custom DTD.

+

An aside

+

I do have one problem with what Alex wants, however (or at least with what he chooses as a code example), and that problem is not the extension of the language, but rather the way in which it was done (i.e. the DIV in his example will degrade to nothing without JavaScript enabled). Dojo supports WAI-ARIA to improve the accessibility of this and many other widgets, which is commendable, but that extra markup is only generated when the Dojo methods are run. If JavaScript is turned off in his example, no form control is available whatsoever.

+

What I’d prefer to see is something like this:

+

<select class="dojo-form-horizontalSlider"
        name="horizontal1"
        showButtons="false"
        intermediateChanges="true"
        id="slider1">
  <option>0</option>
  <option>5</option>
  <option selected="selected">10</option>
  <!-- … -->
  <option>100</option>
</select>
+

There’s nothing that could keep Dojo from parsing that bit of XHTML and gleaning from it what it needs to make the slider. And now, when Dojo doesn’t run, there is a degradable interface for the user to adjust the setting. Sure, it may not be nearly as nice, but at least it works. Plus, it allows Dojo to be added as a progressive enhancement, which is what it should be.

+

And to make this valid syntax, the Dojo team just needs to augment the XHTML 1.0 Strict DTD like this to include the custom attributes:

+

<!ENTITY % Boolean
   "(true | false)"
   >

<!-- attributes for Dojo Toolkit
   showButtons          display buttons (boolean)
   intermediateChanges  display intermediate steps (boolean)
-->
<!ENTITY % dojo-attrs
   "showButtons          %Boolean; #IMPLIED
    intermediateChanges  %Boolean; #IMPLIED"
   >

<!ATTLIST select
   %attrs;
   name         CDATA      #REQUIRED
   size         %Number;   #IMPLIED
   multiple     (multiple) #IMPLIED
   disabled     (disabled) #IMPLIED
   tabindex     %Number;   #IMPLIED
   onfocus      %Script;   #IMPLIED
   onblur       %Script;   #IMPLIED
   onchange     %Script;   #IMPLIED
   %dojo-attrs;
   >
+

Sure, it takes a little extra work, but at least it gives users of the Dojo Toolkit the ability to validate their documents, which will help reduce potential CSS and JavaScript conflicts and errors (just one of the many benefits of web standards). Beyond that, it takes advantage of the extensibility of the language to facilitate innovation, and isn’t innovation what we all really want to see?

+

diff --git a/export/2007-10-11-server-side-figurehandler-thoughts.md b/export/2007-10-11-server-side-figurehandler-thoughts.md new file mode 100644 index 0000000..0ec8d02 --- /dev/null +++ b/export/2007-10-11-server-side-figurehandler-thoughts.md @@ -0,0 +1,22 @@ +--- +title: "Server-side FigureHandler thoughts" +date: 2007-10-11 12:19:49 +comments: true +tags: + - "(x)HTML" + - "coding" + - "CSS" + - "design" + - "JavaScript" + - "projects & products" +description: "In reaction to my latest article for A List Apart , on FigureHandler , many folks have boldly claimed that this sort of thing should be done server-side . Below are my thoughts on the matter as posted as a comment in the article’s..." +permalink: /archives/server-side-figurehandler-thoughts/ +--- + +

In reaction to my latest article for A List Apart, on FigureHandler, many folks have boldly claimed that this sort of thing should be done server-side. Below are my thoughts on the matter as posted as a comment in the article’s discussion thread.

+
+

Many of you have brought up that this should be done server-side and, while I agree that it could, it would need to be done in the most flexible way possible (which many won’t bother with). You see, what this script allows quite easily is redesign; a designer can change page layout—of an entire site or section by section—without ever having to touch the back-end. It also allows for different columns to receive different figure classification schema.

+

If this were done on the content-entry side (as some have suggested), the image classifications would be stored in the database (or XML or whatever) along with the rest of the content HTML. That means that if the design were to shift to a wider column (for example), the figures that once occupied a half-column may no longer continue to do so, making the classifications hard-coded in the HTML incorrect.

+

The only way to truly do this flexibly on the back end (as far as I can see) is to leave the classification step to be handled by a function which pre-processes the page output, dynamically assigning the classifications to each figure based on values obtained from the CSS for that page. Essentially, the script would need to go through the same steps as the JavaScript, but it would need to be able to go the extra step of determining applicable CSS rules to obtain the column width. Thankfully, most server-side languages support some means of DOM walking (albeit sometimes in less-than-desirable ways), but, as far as I know, none have a CSS parser, so you’d likely need to write that as well. From a server overhead point-of-view, I imagine that preprocessing would be fairly costly (most DOM-related stuff is), but the output for each page could be cached, reducing it somewhat.

+

If you’re interested in doing something like this, goodonya. I’ve built you a pretty decent roadmap for implementation, but I don’t imagine it will be easy to get it up and running. That said, I wish you luck…it would be yet another great tool for enabling designers to create consistent layouts with figures.

+
diff --git a/export/2008-02-25-automatically-opting-in-to-ie-standards-mode.md b/export/2008-02-25-automatically-opting-in-to-ie-standards-mode.md new file mode 100644 index 0000000..ffc150c --- /dev/null +++ b/export/2008-02-25-automatically-opting-in-to-ie-standards-mode.md @@ -0,0 +1,16 @@ +--- +title: "Automatically opting-in to IE8’s Standards Mode" +date: 2008-02-25 23:18:14 +comments: true +tags: + - "browsers" + - "coding" + - "design" + - "web standards" +description: "As some of you have read (or heard ), WaSP organized a Round Table discussion on IE 8’s standards mode and its default behavior of opting-out any sites that don’t engage in version targeting . We discussed a few different aspects of the..." +permalink: /archives/automatically-opting-in-to-ie-standards-mode/ +--- + +

As some of you have read (or heard), WaSP organized a Round Table discussion on IE8’s standards mode and its default behavior of opting-out any sites that don’t engage in version targeting. We discussed a few different aspects of the issues this presents for standards-aware developers (and progress on the web in general) and discussed a few tacks Microsoft could take to make IE8 more standardista-friendly.

+

One proposal that, to me, appeared to hold the most promise was one that involved extending IE8’s scheme of automatically opting-in unknown valid DOCTYPEs to also include Strict DOCTYPEs of HTML and XHTML currently in use. The current proposal hinges on the relative popularity (or unpopularity) of a given DOCTYPE: unrecognized DOCTYPEs are assumed to be future or custom DOCTYPEs and will automatically be opted-in to the latest and greatest standards mode of any given future version of IE; that is, until that DOCTYPE becomes “popular” enough to warrant associating it with a given version of IE. This, in a nutshell, means that if a new DOCTYPE were to come along after IE8 launches—say, HTML 5—IE8 would render it in standards mode, but if that DOCTYPE became “popular” before IE9 came out, IE9 would likely act as though it was IE8 when rendering those pages.

+

Chris Wilson did not have numbers on the relative popularity of Strict mode DOCTYPEs vs. Transitional and Frameset on either HTML or XHTML, but given that most authoring tools do not automatically generate Strict documents, it is a strong possibility that the popularity of Strict mode DOCTYPEs may make them a candidate for being automatically opted-in to standards mode, at least in IE8. That would be great news for standards-aware developers who want IE8’s standards improvements, but don’t want to engage in version targeting.

diff --git a/export/2009-04-22-were-back-sort-of.md b/export/2009-04-22-were-back-sort-of.md new file mode 100644 index 0000000..f12e236 --- /dev/null +++ b/export/2009-04-22-were-back-sort-of.md @@ -0,0 +1,20 @@ +--- +title: "We’re back (sort of)" +date: 2009-04-22 11:05:51 +comments: true +tags: + - "business" + - "CSS" + - "JavaScript" + - "projects & products" +description: "After making a ridiculously stupid mistake by axing the server that hosted this blog (without checking that I had actually moved it to the new server and without making sure I had a backup of the DB ), Easy! Reader is back. Sort of..." +permalink: /archives/were-back-sort-of/ +--- + +

After making a ridiculously stupid mistake by axing the server that hosted this blog (without checking that I had actually moved it to the new server and without making sure I had a backup of the DB), Easy! Reader is back. Sort of. Thankfully, I had a backup from late ‘06 and I haven’t been an incredibly prolific blogger in the time since that backup. And, thanks to the Internet Archive, it looks like we should be able to recover all but one article (my last post, from about a year ago) from the ether. It may take a little time, but we should have it all up in the next few weeks.

+

So what’s going on? Well, a lot.

+

For one, we’ve relocated from New Haven, CT to Chattanooga, TN after being urged to visit by Mr. Shaun Inman and his lovely bride Leslie and falling in love with this awesome city. We made the move in August of last year and, after spending a few months in an apartment, have bought a house and will be moving in this weekend. Chattanooga is an amazing place. There’s always something going on, it has a wonderful art scene and tech community, and is nestled in the mountains, right along the Tennessee River. It has many of the perks of Portland, OR and San Francisco, CA (other cities we considered moving to), but at 1/4-1/3 the cost. I couldn’t ask for a better place to live.

+

Since relocating, Easy! Designs has also been growing. We’ve taken on two interns – Matt Turnure and Sean McCarthy – and have been joined full-time by both Dave Stewart (who I had previously worked with in CT) and Matt Harris (an excellent developer from the UK), so expect to be hearing from them on this site soon as well. In addition to our client work, we’ve been busily coding away on a few products of our own that should hopefully see the light of day in the coming year. We’re also working on a relaunch of our own website and this blog.

+

Finally, there’s eCSStender. I’ve been working on this project for ages and it’s currently in a closed beta. Things are progressing smoothly on its development though and I expect it will be ready for its initial public release in the coming weeks.

+

Anyway, that’s the nickel tour of the changes. I apologize profusely for rendering this blog pretty useless with my error, but hopefully we’ll have it all back up and running shortly.

+

PS - If you happen to have an archive of my blog post on IE8 Standards Mode, please forward a copy of it to me. I can’t seem to find it even though it appears to be somewhere in the Google Cache.

diff --git a/export/2009-07-09-rip-xhtml2.md b/export/2009-07-09-rip-xhtml2.md new file mode 100644 index 0000000..17c3550 --- /dev/null +++ b/export/2009-07-09-rip-xhtml2.md @@ -0,0 +1,13 @@ +--- +title: "RIP XHTML 2" +date: 2009-07-09 12:46:34 +comments: true +tags: + - "(x)HTML" + - "web standards" +description: "I wasn’t planning to weigh in much on this subject, but I’ve been asked by several people for my thoughts, so here we go…" +permalink: /archives/rip-xhtml2/ +--- + +

I wasn’t planning to weigh in much on this subject, but I’ve been asked by several people for my thoughts, so here we go…

+

This decision by the W3C to not renew the charter for the XHTML 2 Working Group has, rather unfortunately, brought out the worst in the Web standards community. Sure, as a community, we’re prone to holy wars over seemingly inconsequential things—abbr vs. acronym, use vs. abuse of definition lists, etc.—but this move has sparked a particularly ugly fight between proponents of XHTML and its detractors (primarily those folks who think it’s pointless to use XHTML if you aren’t serving it with an XML MIME type).

Personally, I have mixed feelings about the decision. I think there were a lot of good ideas in XHTML 2 (everything can be a link, for one), but it also had a number of shortcomings. I feel much the same about HTML 5; some of the new elements make a lot of sense, but others seem to be solving a problem that really wasn’t there to begin with.

In the end, I think this is probably a good move for the W3C as it will, hopefully, allow them to reallocate resources to projects that need them.

But does it mean I think XHTML is a failure? No.

I think XHTML was a phenomenal success as it made us look at HTML in a new light. It forced us to think about how we marked up documents and applied much-needed pressure on developers to make smarter decisions. Without it, I dare say the Web standards movement would never have gotten as much traction as it did and we would still be in the midst of the browser war started more than a dozen years ago.

diff --git a/export/2009-09-21-getting-tinymce-to-respect-empty-alt-attributes.md b/export/2009-09-21-getting-tinymce-to-respect-empty-alt-attributes.md new file mode 100644 index 0000000..60e8ceb --- /dev/null +++ b/export/2009-09-21-getting-tinymce-to-respect-empty-alt-attributes.md @@ -0,0 +1,16 @@ +--- +title: "Getting TinyMCE to respect empty alt attributes" +date: 2009-09-21 12:36:24 +comments: true +tags: + - "(x)HTML" + - "accessibility" + - "coding" + - "JavaScript" + - "web standards" +description: "This one took a little futzing around and digging through the TinyMCE forum to figure out, but it’s been nagging at me for a while. By default (or at least in the default configuration provided under the LG TinyMCE extension for..." +permalink: /archives/getting-tinymce-to-respect-empty-alt-attributes/ +--- + +

This one took a little futzing around and digging through the TinyMCE forum to figure out, but it’s been nagging at me for a while. By default (or at least in the default configuration provided under the LG TinyMCE extension for ExpressionEngine), TinyMCE will remove the alt attribute if it is empty. Obviously, for accessibility and validation reasons, this is highly undesirable and needs correcting. Thankfully, the fix is pretty simple: add the following to your TinyMCE configuration options:
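The configuration snippet embedded here was lost when the post was exported, so what follows is a hedged reconstruction rather than the original. TinyMCE’s documented extended_valid_elements option controls which attributes survive cleanup, and an = suffix on an attribute (as in alt=) gives it an empty default value so it is kept even when blank. The attribute list below is illustrative, not the author’s original:

```javascript
// Hedged reconstruction (the original snippet was lost in export).
// "alt=" gives the attribute an empty default so TinyMCE keeps it
// even when blank; the rest of the attribute list is illustrative.
var tinyMCEConfig = {
  extended_valid_elements: 'img[!src|alt=|title|width|height|class]'
};
// …merge this key into your existing tinyMCE.init() options.
```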

+ diff --git a/export/2010-01-18-a-new-onload-scheme.md b/export/2010-01-18-a-new-onload-scheme.md new file mode 100644 index 0000000..fd9934c --- /dev/null +++ b/export/2010-01-18-a-new-onload-scheme.md @@ -0,0 +1,22 @@ +--- +title: "A new “onload” scheme" +date: 2010-01-18 21:30:45 +comments: true +tags: + - "coding" + - "JavaScript" + - "web standards" +description: "A few projects back, I decided to rethink our JavaScript organization strategy and came up with a new technique that, I think, helps us better manage behaviors from page to page." +permalink: /archives/a-new-onload-scheme/ +--- + +

A few projects back, I decided to rethink our JavaScript organization strategy and came up with a new technique that, I think, helps us better manage behaviors from page to page.

+

For years, when I needed page-specific interactions, I would either embed the JS (unobtrusively, of course) at the bottom of the page or externalize it to a separate page-specific file. In some sites, that became a difficult setup to manage because we were juggling so many files and we were also forcing our users to download each of those files individually.

+

Looking for a better way to manage all of the code, I built FunctionHandler. This script lets you declare blocks of JavaScript and then target them at pages based on the id attribute on the body element. When the targeted id is encountered, the code block is executed on DOM ready. Here’s a quick example:

+

FunctionHandler.register(
  ['home'],
  function(){
    alert("I'm gonna run some code here.");
  });
+

As you can see, using it is pretty simple: you make a call to FunctionHandler’s register method and pass it two arguments. The first is an array of the id values you want this code block to execute on and the second is an anonymous function that wraps your code block.

+

What we’ve found really nice about this setup is that it encourages you to create discrete JavaScript components while, at the same time, easily allowing you to adjust the pages that those components run on by simply adding to or subtracting from the id stack. You can even blanket every page with a given script by supplying a string value of “*” as the initial argument:

+

FunctionHandler.register(
  '*',
  function(){
    // Typekit
    // Google Analytics
    // etc.
  });
+

Anyway, I just wanted to take a brief moment to share this script because we’ve found it pretty handy. Perhaps you will too.

+

PS - FunctionHandler is available in 3 flavors: native JS, jQuery, and Prototype.

+

diff --git a/export/2010-03-17-counting-the-results-of-a-nested-query-in-expressionengine.md b/export/2010-03-17-counting-the-results-of-a-nested-query-in-expressionengine.md new file mode 100644 index 0000000..d890a34 --- /dev/null +++ b/export/2010-03-17-counting-the-results-of-a-nested-query-in-expressionengine.md @@ -0,0 +1,19 @@ +--- +title: "EE Tip: Counting the results of a nested query" +date: 2010-03-17 18:16:25 +comments: false +tags: + - "content management" + - "databases" + - "ExpressionEngine" + - "MySQL" + - "PHP" +description: "If you’ve built anything remotely challenging in ExpressionEngine, you’ve no doubt discovered things that are easier done in native PHP than in EE tags. A lot of it has to do with how ExpressionEngine parses templates and what gets..." +permalink: /archives/counting-the-results-of-a-nested-query-in-expressionengine/ +--- + +

If you’ve built anything remotely challenging in ExpressionEngine, you’ve no doubt discovered things that are easier done in native PHP than in EE tags. A lot of it has to do with how ExpressionEngine parses templates and what gets parsed first.

+

One recent bugbear I ran into was trying to use the {count} “magic” variable from a call to {exp:query} that resided inside a loop. I needed the {entry_id} from the entry in the SQL statement, but {count} (despite being used inside {exp:query}) was evaluating as the outer loop’s count and not the {exp:query} count. To solve the issue, I came up with the following:

+ +

You’ll notice I’m using {exp:query} twice. The first time is to establish a variable in the SQL connection. Then I am free to use the variable in the second query and the count (returned as {query_count}) will be a count of the inner loop instead of the outer one.

+

It is important to note, however, that MySQL will evaluate the variable’s incrementation before paying attention to any ORDER BY clauses, so your mileage may vary. Regardless, it’s a handy technique.

diff --git a/export/2010-04-05-subtree-merge-as-an-alternative-to-submodules-with-git-svn.md b/export/2010-04-05-subtree-merge-as-an-alternative-to-submodules-with-git-svn.md new file mode 100644 index 0000000..3ac2226 --- /dev/null +++ b/export/2010-04-05-subtree-merge-as-an-alternative-to-submodules-with-git-svn.md @@ -0,0 +1,35 @@ +--- +title: "Subtree merge as an alternative to submodules with git svn" +date: 2010-04-05 10:45:21 +comments: false +tags: + - "Git" + - "Subversion" + - "version control" +description: "We use Subversion as our version control system for all client work here at Easy because we absolutely love Springloops’ hosted Subversion service , but we use Git for all of our open source projects because, well, Git is a lot more fun..." +permalink: /archives/subtree-merge-as-an-alternative-to-submodules-with-git-svn/ +--- + +

We use Subversion as our version control system for all client work here at Easy because we absolutely love Springloops’ hosted Subversion service, but we use Git for all of our open source projects because, well, Git is a lot more fun to work with and we love the community that’s built up around Github. In order to have the best of both worlds when working on client projects, we use git-svn as our front-end to Subversion. It’s a great tool, but it’s not without its limitations. One such limitation is its inability to translate Git submodules into svn:externals. Thankfully, Git offers an alternative that is comparable and plays nicely with Subversion: the subtree merge.

+

When attempting to dcommit a Git repository containing a submodule, you’ll likely receive a message like this:

+
+

952bee47201e87b0b0e851bcbe6c8940d429cda0 doesn’t exist in the repository at /usr/local/git/libexec/git-core/git-svn line 3787
Failed to read object 952bee47201e87b0b0e851bcbe6c8940d429cda0 at /usr/local/git/libexec/git-core/git-svn line 480

+
+

That annoying message is the painful reminder that you need to find another way to add content from another project into your repository. Subtree merge to the rescue!

+

If you’ve already hit the error, go ahead and delete your submodule folder(s) and the .gitmodules file and commit the changes to your repository to make the path available again. Next, from a shell within the root of your Git repository enter these commands at the prompt (replacing the capitalized phrases with your relevant information):

+
    +
  1. git remote add -f LOCAL_NAME PATH/TO/GIT/REPOSITORY
  2. git merge -s ours --no-commit LOCAL_NAME/BRANCH_NAME
  3. git read-tree --prefix=PATH/I/WANT/IT/IN/ -u LOCAL_NAME/BRANCH_NAME
  4. git commit -m "Merge of PROJECT"
  5. git pull -s subtree LOCAL_NAME master
+

To provide a fully fleshed-out example for you, I used the following to merge the master branch of eCSStender into the path vendors/ecsstender within another project.

+
    +
  1. git remote add -f eCSStender git://github.com/easy-designs/eCSStender.js.git
  2. git merge -s ours --no-commit eCSStender/master
  3. git read-tree --prefix=vendors/ecsstender/ -u eCSStender/master
  4. git commit -m "Merge of eCSStender into the vendors directory"
  5. git pull -s subtree eCSStender master
+

The beauty of this is that you can use that last line to pull in the latest version of the external project and then all you have to do is dcommit the changes to get them into Subversion. Problem solved.

diff --git a/export/2010-07-11-template-based-asset-munging-in-expressionengine.md b/export/2010-07-11-template-based-asset-munging-in-expressionengine.md new file mode 100644 index 0000000..fd0bd7a --- /dev/null +++ b/export/2010-07-11-template-based-asset-munging-in-expressionengine.md @@ -0,0 +1,24 @@ +--- +title: "Template-based Asset Munging in ExpressionEngine" +date: 2010-07-11 12:28:47 +comments: true +tags: + - "content management" + - "CSS" + - "ExpressionEngine" + - "JavaScript" + - "optimization & performance" +description: "In our years of working with ExpressionEngine, we’ve tweaked our standard setup quite a few times. We generally handle most every asset, including CSS and JavaScript, as a template. Being a bit obsessed with organization and overall..." +permalink: /archives/template-based-asset-munging-in-expressionengine/ +--- + +

In our years of working with ExpressionEngine, we’ve tweaked our standard setup quite a few times. We generally handle most every asset, including CSS and JavaScript, as a template. Being a bit obsessed with organization and overall maintainability of code, we separate out our styles and scripts into separate templates for each major concern (e.g. typography, color, screen layout, etc.).

+
+

A while back, it was not uncommon for us to include each of these assets into the document separately, but, as website optimization and performance folks will tell you, all of that separation leads to a lot of additional overhead because the browser must request each of those files individually. In the interest of streamlining the download process, we decided to merge all of the stylesheets together at the template level before sending them over the wire. Here’s the simple recipe we devised:

+

/* -------------------------------
 * Core Stylesheet
 * Created by Easy! Designs, LLC
 * http://easy-designs.net
 * ------------------------------- */
{embed="styles/reset"}
{embed="styles/typography"}
@media screen {
  {embed="styles/layout-screen"}
}
@media print {
  {embed="styles/layout-print"}
}
{embed="styles/color"}
{embed="styles/effects"}
+

This framework allows us to pull in each template in the optimum way for progressive enhancement with only a single download on the user end, which is much faster. And server-side caching only adds to the speed improvements. Beyond that, we can continue to add new @media blocks (including media queries) as necessary either within the embedded files or in this master one.

+

We use a similar setup for our JavaScript:

+

{embed="javascripts/jquery.FunctionHandler"}
{embed="javascripts/jquery.hoverIntent"}
{embed="javascripts/eCSStender"}
/* Individual page handlers go here */
+

In this particular example, we’re including two jQuery plugins: FunctionHandler and hoverIntent, along with eCSStender before adding our page-specific code in FunctionHandler registrations. (jQuery itself is loaded in from Google.)

+

Using ExpressionEngine’s template system to manage the munging like this is dead simple and (from our experience evaluating other people’s EE setups) often underused. Give it a shot on your next project.

+

diff --git a/export/2010-07-19-give-a-hoot.md b/export/2010-07-19-give-a-hoot.md new file mode 100644 index 0000000..326e72e --- /dev/null +++ b/export/2010-07-19-give-a-hoot.md @@ -0,0 +1,27 @@ +--- +title: "Give a hoot" +date: 2010-07-19 19:06:57 +comments: false +tags: + - "coding" + - "JavaScript" + - "optimization & performance" + - "PHP" +description: "As any competent JavaScript knows, it’s not cool to litter the global namespace with variables, functions, and the like. It’s far better to encapsulate your code in an object, a series of objects, or even a closure, exposing only what..." +permalink: /archives/give-a-hoot/ +--- + +

As any competent JavaScript developer knows, it’s not cool to litter the global namespace with variables, functions, and the like. It’s far better to encapsulate your code in an object, a series of objects, or even a closure, exposing only what you absolutely need to via the global namespace. This helps reduce the potential for collisions that will probably cause your site to break.

+

Occasionally, however, even closures won’t help you trap a given variable. Case in point:

+

(function(){
  var a = b = 0;
})();
+

If you’re familiar with languages like PHP, you might think this simple closure creates two local variables with the same value, but you’d be wrong. It actually creates a local variable, a, and a global variable, b, both of which have their value set to 0.

+

<?php
function example()
{
$a = $b = 0;
}
?>
+

In PHP, this means of sharing the value assignment of variables is perfectly legit; the difference, however, is how JavaScript and PHP treat variable scope. By default, every variable declared in PHP is scoped to the function it is declared within. Global variables only come into play when you use the global keyword or the $GLOBALS array. In JavaScript, by contrast, any variable not declared with var is added to the global namespace. Hence the namespace pollution in the above example.

+

Revisiting the closure, it’s best to rewrite it in one of two ways to maintain the variable scope:

+

(function(){
var a = 0, b = 0;
})();
+

or

+

(function(){
var a = 0, b = a;
})();
+

Which solution works best depends entirely on context. If you’re minifying the code and the value being assigned is anything more than a single character, the latter is probably the way to go.

+

To help you discover and mitigate pollution in your own scripts (or to help you see what additions your standard JavaScript libraries are making to the global namespace), I’ve created a little script called EmissionsTest.js. It’s pretty easy to use: simply include it as the first script on your page (preferably in the head of your document) and it does the rest. It will attempt to report its findings to the console (if your browser has one) or it will create a floating notice at the top of the page.

+
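I won’t reproduce EmissionsTest.js here, but the general snapshot-and-diff approach such a script might take can be sketched like this (this is an illustration, not the actual source; globalThis stands in for window so the sketch runs anywhere, and accidentalGlobal is a made-up name):

```javascript
// snapshot the global object's keys before any other script runs
var baseline = {};
var key;
for (key in globalThis) { baseline[key] = true; }

// ...other scripts load and run; one of them leaks a variable
// (this explicit assignment simulates a forgotten `var`)...
globalThis.accidentalGlobal = 'oops';

// later, diff the current globals against the snapshot
var emissions = [];
for (key in globalThis) {
  if (!baseline[key]) { emissions.push(key); }
}
console.log(emissions); // includes "accidentalGlobal"
```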

You won’t want to include this script on a production site and it’s still pretty basic, but it could be very useful for tracking down any accidental emissions in your script.

+

diff --git a/export/2010-08-02-be-a-good-localstorage-neighbor.md b/export/2010-08-02-be-a-good-localstorage-neighbor.md new file mode 100644 index 0000000..fdfc0b2 --- /dev/null +++ b/export/2010-08-02-be-a-good-localstorage-neighbor.md @@ -0,0 +1,23 @@ +--- +title: "Be a good localStorage neighbor" +date: 2010-08-02 12:17:28 +comments: true +tags: + - "(x)HTML" + - "browsers" + - "coding" + - "databases" + - "JavaScript" + - "projects & products" + - "web standards" +description: "Most JavaScript developers are keenly aware of what they add to the global object and do their best to namespace their work or sequester it in closures . Namespacing and closures reduce the likelihood that necessary functions and..." +permalink: /archives/be-a-good-localstorage-neighbor/ +--- + +

Most JavaScript developers are keenly aware of what they add to the global object and do their best to namespace their work or sequester it in closures. Namespacing and closures reduce the likelihood that necessary functions and variables will be accidentally overwritten, causing errors to be thrown and interfaces to break. Unfortunately, the localStorage API (available in most modern browsers) doesn’t inherently support creating isolated caches for each script because the cache is site-specific and consists simply of key-value pairs. Internet Explorer’s userData behavior (which is available all the way back to IE5) does support sequestering the cache to a degree because you need to provide a name for it, but the API doesn’t make a whole lot of sense and isn’t at all equivalent to localStorage.

+

Using the native APIs, it’s quite easy to accidentally overwrite an existing key in the cache. Beyond that, a simple call to localStorage.clear() will wipe out not only your own data, but anything else stored in the local cache. It’s not good.

+

While working on eCSStender’s implementation of client-side caching, I came to realize the problems with the current state of things and sought to address them by implementing faux namespacing via prefixed keys. I’ve since copied that code out of eCSStender and created a small library named Squirrel.js that not only evens out the differences between localStorage and userData, but also makes it easier to manage your client-side data store in a manner unlikely to cause issues with other scripts also using client-side caching.

+
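The prefixing idea itself is simple enough to sketch (this illustrates the concept only, not Squirrel.js source; the Squirrelish name and the localStorage stand-in are mine):

```javascript
// faux namespacing: every key is stored under "prefix:key"
function Squirrelish(store, prefix) {
  this.store = store;
  this.prefix = prefix + ':';
}
Squirrelish.prototype.write = function (key, value) {
  this.store.setItem(this.prefix + key, value);
};
Squirrelish.prototype.read = function (key) {
  return this.store.getItem(this.prefix + key);
};

// a tiny localStorage stand-in so the sketch runs outside a browser;
// in a page you would pass window.localStorage instead
var fakeStorage = {
  data: {},
  setItem: function (k, v) { this.data[k] = String(v); },
  getItem: function (k) { return (k in this.data) ? this.data[k] : null; }
};

var scriptA = new Squirrelish(fakeStorage, 'script-a');
var scriptB = new Squirrelish(fakeStorage, 'script-b');
scriptA.write('doe', 'ray');
scriptB.write('doe', 'me');
// same key name, no collision, because each lives under its own prefix
console.log(scriptA.read('doe')); // "ray"
console.log(scriptB.read('doe')); // "me"
```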

Here is a quick rundown of how Squirrel.js works:

+

// create a Squirrel instance
var $S = new Squirrel( 'scale-song' );
// write a value to the cache
$S.write( 'doe', 'ray' );
// read it back
$S.read( 'doe' ); // 'ray'
// write a value to a sub-cache
$S.write( 'song', 'doe', 'a dear, a female dear' );
// read back the original value
$S.read( 'doe' ); // 'ray'
// read back the sub-cached value
$S.read( 'song', 'doe' ); // 'a dear, a female dear'
// removing a single property from the sub-cache
$S.remove( 'song', 'doe' );
// try to read the sub-cached value
$S.read( 'song', 'doe' ); // null
// read the root value
$S.read( 'doe' ); // 'ray'
// add some more content to the sub-cache
$S.write( 'song', 'doe', 'a dear, a female dear' );
$S.write( 'song', 'ray', 'a drop of golden sun' );
// clear the whole sub-cache
$S.clear( 'song' );
// check that it's been cleared
$S.read( 'song', 'doe' ); // null
$S.read( 'song', 'ray' ); // null
// check that the root value's still intact
$S.read( 'doe' ); // 'ray'
// remove a property from the main cache
$S.remove( 'doe' );
// check its value
$S.read( 'doe' ); // null
// write a bit more data in the root and in a sub-cache
$S.write( 'doe', 'ray' );
$S.write( 'song', 'doe', 'a dear, a female dear' );
$S.write( 'song', 'ray', 'a drop of golden sun' );
// clear the whole cache
$S.clear();
// check it's all gone
$S.read( 'song', 'doe' ); // null
$S.read( 'song', 'ray' ); // null
$S.read( 'doe' ); // null
+

For more, check out the GitHub page. Feel free to let me know your thoughts on how easy it is to use and how it can be improved.

+

diff --git a/export/2010-08-26-honored.md b/export/2010-08-26-honored.md new file mode 100644 index 0000000..b73f6b8 --- /dev/null +++ b/export/2010-08-26-honored.md @@ -0,0 +1,22 @@ +--- +title: "Honored" +date: 2010-08-26 10:51:26 +comments: true +tags: + - "awards" + - "CSS" + - "mobile" + - "web standards" +description: "Late last week, the nominees for the 2010 .net Awards were announced and I was amazed to find myself nominated for not one, but two awards." +permalink: /archives/honored/ +--- + +

Late last week, the nominees for the 2010 .net Awards were announced and I was amazed to find myself nominated for not one, but two awards.

+
+

The first nomination I received is for “Standards Champion.” It’s an absolute honor to have been nominated alongside Derek Featherstone, Molly Holzschlag, Jeremy Keith, Jeffrey Zeldman, and a handful of other folks who’ve not only played a pivotal role in the widespread adoption of web standards, but who’ve also contributed greatly to my development as a web professional and educator.

+
+

The second nomination I received was in the category of “Mobile Site of the Year” for colly.com. As some of you probably know, I wrote a little article on adaptive layouts with media queries for .net’s Summer Issue (#205). In that article, I gave the homepage of Mr. Simon Collision a makeover for the iPad and iPhone. Colly was so excited with what I’d done that he used my changes as a baseline to convert the remainder of his site to employ adaptive layouts. The adaptive redesign turned quite a few heads and became an oft-cited example of adaptive layouts in practice.

+

While I’ve received numerous awards for my work over the years, these two nominations were greatly appreciated. I offer my sincerest thanks to those who put my name in the hat. Win or lose, I can honestly say that I’m a very happy man.

+

Of course it wouldn’t hurt if you voted for me though. ;-)

+

Public voting ends on 12 October 2010.

+

Full disclosure: I am also a judge in the .net Awards (as are several other nominees), but I did no lobbying to become a nominee.

diff --git a/export/2010-12-08-we-built-a-chrome-app.md b/export/2010-12-08-we-built-a-chrome-app.md new file mode 100644 index 0000000..58de22f --- /dev/null +++ b/export/2010-12-08-we-built-a-chrome-app.md @@ -0,0 +1,28 @@ +--- +title: "We Built a Chrome App" +date: 2010-12-08 21:47:52 +comments: false +tags: + - "(x)HTML" + - "browsers" + - "coding" + - "CSS" + - "databases" + - "design" + - "JavaScript" + - "projects & products" + - "web standards" +description: "Yesterday saw the launch of the Chrome App Store and, along with it, a Chrome app we created called the wikiHow Survival Kit ( also available as a web app )." +permalink: /archives/we-built-a-chrome-app/ +--- + +

Yesterday saw the launch of the Chrome App Store and, along with it, a Chrome app we created called the wikiHow Survival Kit (also available as a web app).

+
+

When we were approached to work on this project several months ago, the specs for the creation of a Chrome app were vague at best. We really had no idea what made a “Chrome app” different from a run-of-the-mill web app or even an HTML5 app. All we knew was that the rendering and JavaScript engines were the same for Chrome apps and web apps, but that Chrome would offer some additional “benefits” to apps built for it. What they were, however, remained a mystery.

+

Our client knew they wanted to leverage the bits of HTML5 and CSS3 that Chrome had implemented (and some the Chrome dev team had promised to implement soon, like 3D transforms), but hadn’t really come to any decision on what features would be included, or how the content would be presented. They just knew they wanted it to look amazing.

+

After throwing around several real-life metaphors, such as page turns and the like, we settled on the idea of dealing content from a magical deck of “survival cards”. Our goal was to do as much of the animation as possible using HTML and CSS, relying on JavaScript only when absolutely necessary, so we could take advantage of the hardware-accelerated animations Google’s Chrome team promised us would also be available in the browser by the time the app store launched.

+

In building this app, we ran Chrome through its paces, uncovering a couple of new bugs and pushing the limits of the browser. The project incorporates a lot of cutting-edge tech, including: CSS-based transforms, transitions and animations; web fonts; a client-side database; HTML5 semantics; hashchange events; and an application manifest. Many of these technologies are still in their infancy and finding a reliable, stable way to work with them has proven quite a challenge, but I think we managed to pull it off with only a handful of newly-gray hairs between us.

+

One final issue we ran into was that, when we started this project, you could build and even “compile” a Chrome app, but there was no way to install it: a beta of the app store, which was required for installation, hadn’t even been built yet. We had no way of seeing how a Chrome app would behave differently from a web app. Our questions abounded, but there were few answers to be had. Would there be a location bar? Would there be browser chrome? How would links outside the app function?

+

We had to bide our time and build the app based on how we thought it would work and hope for the best until the app store was ready and we could actually install the app and test it out. Thankfully, as it turned out, installing the Chrome app didn’t really make it all that different from the web app. The only real difference was that, as a Chrome app, the Survival Kit was given a larger offline cache. Oh, and a smaller tab.

+

After the announcement and unveiling of the store yesterday, the interwebs were aflutter with opinions about the significance (or lack thereof) of Google’s choice to create such a store. Sure, the creation of a Chrome app was the impetus for this project, but for us the creation process was exactly the same as we’d use with any project: we built an amazing web app. In truth, I wish we’d had the time and budget to make it usable in any modern browser (unfortunately, that was beyond the scope of our contract), but I am encouraged by the fact that it is available on the web as well, which means you can visit it in any other modern Webkit browser.

+

Daniel will be posting a technical round-up of the project in the coming week and he’ll dig into the details of the project a bit more.

diff --git a/export/2011-02-09-you-cant-rely-on-javascript.md b/export/2011-02-09-you-cant-rely-on-javascript.md new file mode 100644 index 0000000..6270fcb --- /dev/null +++ b/export/2011-02-09-you-cant-rely-on-javascript.md @@ -0,0 +1,30 @@ +--- +title: "Face It: You Can’t Rely on JavaScript" +date: 2011-02-09 05:51:28 +comments: true +tags: + - "accessibility" + - "coding" + - "JavaScript" + - "progressive enhancement" + - "search engine optimization" + - "usability" + - "web standards" +description: "I’ve been cautioning folks against over-reliance on JavaScript for the better part of a decade. In that time, I harped a lot on Lala.com (which was eventually bought by Apple and shuttered) because they loaded all of their content via..." +permalink: /archives/you-cant-rely-on-javascript/ +--- + +

I’ve been cautioning folks against over-reliance on JavaScript for the better part of a decade. In that time, I harped a lot on Lala.com (which was eventually bought by Apple and shuttered) because they loaded all of their content via Ajax. If you showed up to the page with JavaScript disabled, you were greeted with a curt “you must be this high to ride” type message and, my favorite feature, a “loading” indicator:

+

+

Of course, without JavaScript, nothing was loading; the site was devoid of content and completely unusable. Even the search box was pointless as it had no submit button and relied on predictive typing to find anything.

+

That was four years ago. Skip ahead to the relaunch of the Gawker Media platform and you have a company (that should really know better) putting all of their eggs in the JavaScript basket yet again. True, they certainly haven’t been the only ones to launch a site design that relied 100% on JavaScript since Lala, but their epic fail yesterday proved, yet again, that you can’t rely on JavaScript (and Ajax).

+
+

So why can’t you rely on JavaScript? Let’s go through the list:

+
    +
  1. Users may choose to turn JavaScript off in their browser (for performance reasons, as a low-fi way to block pop-ups and ads, or because they subscribe to the age-old misconception that JavaScript is inaccessible).
  2. Network administrators may block JavaScript at the firewall (usually because they think it’s insecure).
  3. A JavaScript issue as simple as a typo could trigger a fatal error that causes JavaScript execution to be aborted completely.
  4. In the case of Ajax, the service you are relying on to deliver content to the browser may, itself, experience an error and return nothing or a bunch of error code.
+

For these reasons, you should always build your website following progressive enhancement: start with the reliable baseline of HTTP and good copywriting; add semantic HTML (and microformats); apply CSS in layers to create visual hierarchies; use Hijax and other progressively-enhanced JavaScript patterns to improve the interactivity; and cap it off with accessibility enhancements in the form of ARIA roles and states.

+
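For the unfamiliar, the Hijax pattern can be sketched roughly like this (the DOM and XHR are stubbed out with plain objects so the sketch is self-contained; in a real page you’d bind to actual anchors and use XMLHttpRequest):

```javascript
// a link that already works as an ordinary server-side page load
var link = { href: '/reviews/page/2', onclick: null };

function fetchFragment(url) {
  // stands in for an XMLHttpRequest back to the same URL the link targets
  return '<ol>reviews from ' + url + '</ol>';
}

// enhancement layer: attached only if this script actually runs
link.onclick = function () {
  this.loaded = fetchFragment(this.href); // swap content in place
  return false; // cancel the default full-page navigation
};

// with JavaScript available, a "click" updates the page in place:
link.onclick.call(link);
console.log(link.loaded); // "<ol>reviews from /reviews/page/2</ol>"
// without JavaScript, link.href still triggers an ordinary page request
```

The key point is that the script intercepts a request that already works without it, which is why a JavaScript failure degrades to a working page rather than a blank one.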

For musings on the Gawker redesign, progressive enhancement, and JavaScript-focused “hash-bang” URLs, read Jeremy’s excellent post and Mike’s in-depth analysis.

diff --git a/export/2011-05-06-knock-on-wood-pulp.md b/export/2011-05-06-knock-on-wood-pulp.md new file mode 100644 index 0000000..db03fe8 --- /dev/null +++ b/export/2011-05-06-knock-on-wood-pulp.md @@ -0,0 +1,12 @@ +--- +title: "Knock on Wood (Pulp)" +date: 2011-05-06 18:59:11 +comments: false +tags: + - "books & articles" +description: "I was pretty chuffed to see Web Design in a Nutshell , a book I wrote a few chapters for, show up in this incredible video of a book xylophone. Check it out:" +permalink: /archives/knock-on-wood-pulp/ +--- + +

I was pretty chuffed to see Web Design in a Nutshell, a book I wrote a few chapters for, show up in this incredible video of a book xylophone. Check it out:

+
diff --git a/export/2011-05-07-on-requiring-facebook-for-login.md b/export/2011-05-07-on-requiring-facebook-for-login.md new file mode 100644 index 0000000..3027088 --- /dev/null +++ b/export/2011-05-07-on-requiring-facebook-for-login.md @@ -0,0 +1,74 @@ +--- +title: "On Requiring Facebook for Login" +date: 2011-05-07 11:03:58 +comments: true +tags: + - "business" + - "projects & products" + - "usability" +description: "Last night, I had a great conversation on Twitter with Jeff Croft about the pros and cons of requiring a Facebook account for login. It’s a trend that seems to be on the rise and I, personally, don’t think it’s a good long term strategy." +permalink: /archives/on-requiring-facebook-for-login/ +--- + +

Last night, I had a great conversation on Twitter with Jeff Croft about the pros and cons of requiring a Facebook account for login. It’s a trend that seems to be on the rise and I, personally, don’t think it’s a good long term strategy.

+

It all started when I visited Home Elephant on a tip from GOOD magazine. It looks like an interesting service, but, as I have chosen not to create a Facebook account, is not something I can sign up for. And so it sparked this tweet:

+
+

AaronGustafson reminds you that not everyone has (or wants) a @facebook account. You’re limiting your reach by requiring one for sign up. /cc @HomeElephant

+
+

It prompted the following response from Jeff:

+
+

I’ll remind you that the limited reach may be a perfectly acceptable business decision, given other trade-offs.

+
+

To which I replied:

+
+

AaronGustafson agrees, @jcroft, a business may want to limit signups when starting up, but is 500 million potential users really a limit?

+
+

Jeff clarified his position (this was a series of tweets I’ve combined here so it’s easier to follow):

+
+

No, that’s not what I meant. What I meant was: Facebook offers you things you can’t get elsewhere. Those might be essential to your product. Or, it may not be worth the effort to build a version of your product that works without them. You said, “you’re limiting your reach.” I’m saying, “yeah, but I may be increasing my bottom line.” Sometimes doing what it takes to support standalone accounts means putting an excessive amount of resources into something.

+
+

I agreed with his position… to a point:

+
+

AaronGustafson sees your point, @jcroft. Still, it seems like there are services that offer options in addition to Facebook if you’re looking for shortcuts

+
+

To which he responded (combined from a short series of tweets):

+
+

What other service offers a social graph and auth? And if I used it, would you just complain that I require THAT service? If I need to accommodate people who don’t want to use Facebook (or whatever), I now have to build my own auth and graph which may be too costly. It’s a business decision. Maybe I’ll trade the extra users for the cost associated with them.

+
+

My reply (again, combined from two tweets):

+
+

AaronGustafson wouldn’t complain about a service that allows people who use different services to register/log in @jcroft. @janrain, for instance. AaronGustafson used @janrain in building the @StandardsSherpa site (in addition to supporting local system accounts). There’s a free version.

+
+

His response:

+
+

All I’m saying is, you have to balance business goals with technical and user goals. Multiple sign-ins is a user goal.

+
+

In a little cross-conversation, I replied

+
+

AaronGustafson agrees about finding a balance, @jcroft. But from a user’s perspective, being excluded from a service because you don’t [use Facebook] is a turn off.

+
+

Jeff continued (again, from a series of tweets):

+
+

JanRain is great, but it’s only for auth, right? Doesn’t have graph, events, pages, groups, photos, etc. that FB has. If those things are essential to your app, you either use FB, or build them yourself (again, increasing cost). Really, I’m just making an observation that us UX people often forget that there are business goals, as well, and sometimes they conflict with the UX goals. Sometimes there are tradeoffs between optimal UX and cost.

+
+

And Jeff’s response to my earlier comment:

+
+

Yep, I agree. Just pointing out, we don’t know WHY that site requires Facebook. Maybe they totally agree that it’d be better if they didn’t, but doing so would have cost them millions. I dunno.

+
+

My response:

+
+

AaronGustafson agrees, @jcroft. There is a trade-off, but there are a lot of cons to @Facebook from a user perspective (many of which are privacy-related). They could have a valid reason, but I think many companies take that route because they think “everyone” is on @Facebook

+
+

I then offered an anecdote (and a thank you):

+
+

AaronGustafson found that on @StandardsSherpa, with 6 login options + local, the split was 50-50 local vs oAuth. And that’s with a tech crowd. Aaron Gustafson enjoyed that discussion @jcroft. Thank you!

+
+

Jeff had a similar experience:

+
+

Yeah, I get the same thing on Lendle. About 50% sign up without FB or Twitter. But, it DID take me more time to allow that. For me, it was worth it. But if others come to another conclusion, that’s cool, too. And yeah, great discussion! :)

+
+

Anyway, I just thought it was worth preserving and sharing that conversation with all of you as login/auth and Facebook integration is a hot topic right now. Coincidentally, GOOD published an article yesterday about why one of their senior editors is not on Facebook.

+

Oh, and Home Elephant got back to me at the end of my conversation with Jeff:

+
+

@AaronGustafson Working our tails off right now for non-FB signups. Stay tuned…

+
diff --git a/export/2011-06-21-i-finally-wrote-a-book.md b/export/2011-06-21-i-finally-wrote-a-book.md new file mode 100644 index 0000000..01e2cd4 --- /dev/null +++ b/export/2011-06-21-i-finally-wrote-a-book.md @@ -0,0 +1,30 @@ +--- +title: "I (Finally) Wrote a Book" +date: 2011-06-21 10:27:12 +comments: true +tags: + - "(x)HTML" + - "books & articles" + - "browsers" + - "coding" + - "CSS" + - "JavaScript" + - "mobile" + - "progressive enhancement" + - "projects & products" + - "usability" + - "web standards" +description: "Over the last five years, one of the most frequent questions I’ve gotten has been “When are you going to write a book? ”" +permalink: /archives/i-finally-wrote-a-book/ +--- + +

Over the last five years, one of the most frequent questions I’ve gotten has been “When are you going to write a book?”

+

You see, I’ve been writing articles and contributing to other people’s books since some time in 2004. In that time, I’ve also presented at dozens of conferences on a variety of topics. The topics I’ve chosen for these endeavors have been all over the map, but the theme that seemed to link them all was progressive enhancement. It’s the philosophical underpinning of everything I do and a subject that gets me excited about coming to work every day. So, naturally, I decided to make that the topic of my first solo book: Adaptive Web Design: Crafting Rich Experiences with Progressive Enhancement.

+
+

This book has been writing itself in my head for the last four or five years, so it was a great feeling to finally commit it to “paper” when I began this process a little over a year ago. My primary goal in writing the book was to thoroughly explain what progressive enhancement is, why it works, and how to use it. It is not meant to be a technique book.

+

My reasoning is simple: People understand techniques that apply progressive enhancement, but techniques come and go (just like browsers do). With a solid understanding of the philosophy and mechanisms of progressive enhancement, our community will be better able to build adaptive websites that truly serve our users. In many ways, I strove to write a philosophy book. A philosophy book with code.

+

Anyway, after about a year of writing, editing, and production (coupled with the formation of a new publishing house, etc.), I’m very happy to have the book out there and into your hands. So far, your response has been overwhelmingly positive, making me feel like this is the book I was meant to write.

+

I thanked a lot of people in the book itself, but I wanted to take another moment to sincerely thank my team here at Easy Designs for their feedback during the writing process and their relentless pursuit of perfection when it came to everything from paper choice, to book design, to finding our printer, and even for their care in the packaging and shipping of my precious baby. They did amazing work considering we’re primarily a web shop. I’d also like to thank Krista Stevens for her amazing editorial guidance, Veerle Pieters for her gorgeous cover design, and Jeffrey Zeldman for his wonderful foreword.

+

Oh, and I’m going to close out this post with one final tidbit about the print version of my book: not only is it printed on 50% recycled paper, but nearly every component of the book and its packaging (including our shipping boxes), was created within 2 hours of Chattanooga, TN (where we’re based). We created a beautiful product and kept every aspect of its production local. I think that’s pretty amazing.

+

If you haven’t grabbed a copy yet, we’re selling paperbacks, eBooks (ePub, Mobi, and PDF), and combo packs on the Easy Readers website and the Kindle edition just went on sale at Amazon.

+

Happy reading!

diff --git a/export/2011-07-12-now-read-this-1.md b/export/2011-07-12-now-read-this-1.md new file mode 100644 index 0000000..f99fb9e --- /dev/null +++ b/export/2011-07-12-now-read-this-1.md @@ -0,0 +1,23 @@ +--- +title: "Now Read This I" +date: 2011-07-12 20:47:17 +comments: false +tags: + - "(x)HTML" + - "books & articles" + - "conferences" + - "humor" +description: "I find a lot of cool links throughout the week and I usually bookmark them on some service, like Pinboard , but for some reason I never considered posting them to the blog. I’m recitfying that as of today." +permalink: /archives/now-read-this-1/ +--- + +

I find a lot of cool links throughout the week and I usually bookmark them on some service, like Pinboard, but for some reason I never considered posting them to the blog. I’m rectifying that as of today.

+

Here are last week’s finds:

+ diff --git a/export/2011-07-14-retreat-remembered.md b/export/2011-07-14-retreat-remembered.md new file mode 100644 index 0000000..5bcb980 --- /dev/null +++ b/export/2011-07-14-retreat-remembered.md @@ -0,0 +1,41 @@ +--- +title: "Retreat, remembered" +date: 2011-07-14 10:51:00 +comments: false +tags: + - "business" + - "conferences" + - "presentations" + - "projects & products" +description: "First off, this post has been way too long in the making. I should have written it a couple months ago, but that’s the thing about running things, you don’t always have time to come up for air. Anyway, without further ado…" +permalink: /archives/retreat-remembered/ +--- + +

+ First off, this post has been way too long in the making. I should have written it a couple months ago, but that’s the thing about running things: you don’t always have time to come up for air. Anyway, without further ado…

+
+
+

+ As many of you know, we officially launched our new training series, Retreats 4 Geeks, with an HTML5 & CSS3 retreat co-led by Eric Meyer and yours truly. The event was held in an amazing cabin on the side of a mountain in Gatlinburg, TN and, as you can probably guess, it was downright magical.

+

+ I know, it’s my event series, so of course I'm gonna gush, but it really was so much better than even I could have imagined, and here’s why:

+

+ The location

+

+ Yeah, some people laughed when they heard we were going to run a tech event in Gatlinburg, but it really was a fantastic location. Sure, it's a tiny little town tucked in the middle of the mountains, but that “isolation” (though we had crystal clear cell reception and wifi with decent bandwidth) really helped us get the daily stress out of our heads so we could focus on learning and collaborating. Plus, where else can you go for a tasting of moonshine, try a deep-fried Oreo and then walk down the road for a game of “Hillbilly Golf”?

+

+ The venue

+

+ Kelly outdid herself when she found our lodge. It was just the right size, offered gorgeous views, and had plenty of amenities to keep us all relaxed and happy for the duration of the retreat. As an added bonus, there was even a family of bears in the neighborhood that paid us a visit to everyone’s disbelief and delight! (Yes, the bears are now on Flickr.)

+

+ My co-lead

+

+ Eric Meyer is an amazing guy. Not only is he one of the smartest people working on the web today, but he's also an incredibly nice fellow and a helluvalot of fun to hang out with. His sessions were awesome, often mind-blowing (for me too), and very practical. He was also an excellent mentor (as though I expected any less of him) and was really the perfect co-lead for our inaugural retreat. Plus I had a great time working on our silly little CSS transforms tool.

+

+ Last, but by no means least, the people

+

+ I am by no means blowing smoke when I say that we had an incredible group of attendees. Everyone got along really well and they were just damn fun to be around, whether we were in the classroom, sitting around the dinner table, or trying to guide our very tall van through the gauntlet of a low-roofed parking garage. I was so sad to bid them farewell on the final day (and I know they were too). If it's any testament as to how awesome they are, I'm actually getting misty thinking about them.

+

+ All told, we felt the retreat was a huge success and are looking forward to organizing next year’s events (to be announced soon!). I’d like to personally extend a huge thank you to Kelly and Jessica for organizing the whole thing and keeping everything running smoothly, to Eric for being an incredible co-trainer, and to all of the attendees for taking a chance on a new idea and making it an incredible experience for all involved.

+

+ Oh, and one last thing, for the hands-on project on the third day of the event, our attendees built a pretty stellar tribute to Johnny Cash using all of the HTML5 & CSS3 knowledge they gained over the previous two days. I was really happy with the fruits of their labor, especially considering they organized themselves and managed to build it all in less than a day. Great job yet again guys!

diff --git a/export/2011-07-18-now-read-this-2.md b/export/2011-07-18-now-read-this-2.md new file mode 100644 index 0000000..ff58b4e --- /dev/null +++ b/export/2011-07-18-now-read-this-2.md @@ -0,0 +1,25 @@ +--- +title: "Now Read This II" +date: 2011-07-18 21:09:56 +comments: false +tags: + - "coding" + - "design" + - "mobile" +description: "Hot links last week:" +permalink: /archives/now-read-this-2/ +--- + +

Hot links last week:

+ diff --git a/export/2011-07-25-now-read-this-3.md b/export/2011-07-25-now-read-this-3.md new file mode 100644 index 0000000..4d2c1c0 --- /dev/null +++ b/export/2011-07-25-now-read-this-3.md @@ -0,0 +1,32 @@ +--- +title: "Now Read This III" +date: 2011-07-25 13:09:49 +comments: true +tags: + - "books & articles" + - "business" + - "CSS" + - "culture & society" + - "design" + - "internationalization & localization" + - "mobile" + - "optimization & performance" + - "search engine optimization" +description: "We found some awesome links last week:" +permalink: /archives/now-read-this-3/ +--- + +

We found some awesome links last week:

+ diff --git a/export/2011-08-01-now-read-this-4.md b/export/2011-08-01-now-read-this-4.md new file mode 100644 index 0000000..32f3be9 --- /dev/null +++ b/export/2011-08-01-now-read-this-4.md @@ -0,0 +1,36 @@ +--- +title: "Now Read This IV" +date: 2011-08-01 18:43:43 +comments: false +tags: + - "business" + - "design" + - "mobile" + - "progressive enhancement" + - "usability" +description: "Awesome stuff last week from the worlds of typography, mobile, customer service, and design." +permalink: /archives/now-read-this-4/ +--- + +

Awesome stuff last week from the worlds of typography, mobile, customer service, and design.

+ diff --git a/export/2011-08-04-experimenting-with-grids-using-ecsstender.md b/export/2011-08-04-experimenting-with-grids-using-ecsstender.md new file mode 100644 index 0000000..07c4fac --- /dev/null +++ b/export/2011-08-04-experimenting-with-grids-using-ecsstender.md @@ -0,0 +1,31 @@ +--- +title: "Experimenting with Grids Using eCSStender" +date: 2011-08-04 19:59:13 +comments: true +tags: + - "browsers" + - "CSS" + - "design" + - "JavaScript" + - "projects & products" + - "web standards" +description: "In preparation for the launch of 10K Apart (Responsive Edition) from Mix Online and An Event Apart , I’ve been feverishly working on a modest implementation of the proposed CSS 3 Grid Layout module (also referred to as Grid Alignment in..." +permalink: /archives/experimenting-with-grids-using-ecsstender/ +--- + +

In preparation for the launch of 10K Apart (Responsive Edition) from Mix Online and An Event Apart, I’ve been feverishly working on a modest implementation of the proposed CSS3 Grid Layout module (also referred to as Grid Alignment in alternate drafts) using eCSStender. As you might imagine, it was a pretty massive undertaking, but it’s been quite rewarding to use eCSStender for its original intent: prototyping implementations of proposed specifications.

+
+

As it stands today, the IE10 platform preview is the only place you can play with CSS-based grid layouts and have them natively rendered by the browser. This JavaScript-based port, however, makes it possible to view them in recent builds of Chrome, Firefox, Safari and even Opera.

+

How do I use it?

+

As with most extensions, enabling grid layout with eCSStender is as simple as including the eCSStender core library and the Grid Alignment extension. It’s up to you whether you’d like to download copies of each and serve them from your own domain or whether you prefer to use the new eCSStender CDN to handle the file delivery for you. To use the CDN, you’d simply include the following two scripts just before the closing body tag:

+

<script src="http://cdn.ecsstender.org/lib/latest/min/eCSStender.js"></script>
<script src="http://cdn.ecsstender.org/ext/CSS3-grid-alignment/latest/min/eCSStender.CSS3-grid-alignment.js"></script>
+

With those in place, you can begin playing with some of the new grid syntax. The test case I built the extension against sets up one of several grids like this:

+

#demo {
  width: 945px;
  display: grid;
  grid-columns: 145px 1fr 145px 1fr 145px 1fr 145px 1fr 145px 1fr 145px;
  grid-rows: 186px 692px 357px;
}
+

That code builds a grid structure within #demo comprising six primary columns (145px wide each) with equal-width “gutter” columns between them; the gutters’ width is determined by dividing the remaining space equally, as indicated by “1fr,” meaning “one fraction.” (Here, the six 145px columns take up 870px of the 945px total, leaving 75px, so each of the five 1fr gutters is 15px wide.) It also defines three rows of varying heights within #demo. Grid items are then positioned on the grid using coordinate-like syntax:

+

#articles {
  grid-row: 2;
  grid-column: 1;
}
+

The full CSS for the Grid System demo can be viewed here.

+

The extension is by no means complete (the spec is fairly large and will require months to build an exhaustive implementation), but it does let you begin experimenting with the syntax immediately.

+

Is This the Future of Grid-based Web Design?

+

To me, this spec is very much in its infancy (despite having an experimental implementation in the IE10 platform preview). The fine folks at Microsoft (who, in full disclosure, funded the development of the extension) are keen to get people playing with the proposed syntax. I, for one, am not completely sold on the syntax as it is currently proposed. From a developer’s standpoint it makes sense because it feels like you’re building an invisible series of rows and columns, onto which you are attaching pieces of your document. As Mark Boulton pointed out on Twitter, however, the syntax is not as analogous to a designer’s concept of a grid, which could slow the adoption of the spec if it were to be finalized as-is.

+

Despite obviously working my tail off to get this extension up and running, I’m most excited to see how the conversation opens up on grid-based CSS layouts and how the spec evolves. Will there be counterproposals? Yup. Mark’s working on one and I’d love to see more. All I know is that the more we—the grunts in the CSS-authoring trenches—get involved in the spec development process, the better the end result will be. And I’m looking forward to helping turn even more ideas into workable prototypes using eCSStender.

+

diff --git a/export/2011-08-09-now-read-this-v.md b/export/2011-08-09-now-read-this-v.md new file mode 100644 index 0000000..3fb902a --- /dev/null +++ b/export/2011-08-09-now-read-this-v.md @@ -0,0 +1,34 @@ +--- +title: "Now Read This V" +date: 2011-08-09 16:20:28 +comments: false +tags: + - "browsers" + - "coding" + - "culture & society" + - "design" + - "internationalization & localization" + - "JavaScript" + - "mobile" + - "optimization & performance" + - "presentations" + - "web standards" +description: "Last week the U.S. teetered on the brink of economic collapse, but there were lots of goodies to think about and share too. Here’s a smattering of things to check out on the topics of design, development, and social commentary:" +permalink: /archives/now-read-this-v/ +--- + +

Last week the U.S. teetered on the brink of economic collapse, but there were lots of goodies to think about and share too. Here’s a smattering of things to check out on the topics of design, development, and social commentary:

+ diff --git a/export/2011-08-11-what-do-you-look-for-in-an-browser-based-rich-text-editor.md b/export/2011-08-11-what-do-you-look-for-in-an-browser-based-rich-text-editor.md new file mode 100644 index 0000000..c8a3524 --- /dev/null +++ b/export/2011-08-11-what-do-you-look-for-in-an-browser-based-rich-text-editor.md @@ -0,0 +1,46 @@ +--- +title: "What Do You Look For in an Browser-based Rich Text Editor?" +date: 2011-08-11 20:44:55 +comments: false +tags: + - "client relations" + - "content management" + - "projects & products" + - "usability" +description: "As a content-focused web development firm, we’re frequently called on to design and build content management systems for our clients. More often than not, that involves using some form of in-browser rich text editor ( RTE ) to more..." +permalink: /archives/what-do-you-look-for-in-an-browser-based-rich-text-editor/ +--- + + + + + + + +

As a content-focused web development firm, we’re frequently called on to design and build content management systems for our clients. More often than not, that involves using some form of in-browser rich text editor (RTE) to more easily allow users to add and edit content. Over the years we’ve used a number of different scripts and libraries to do this—everything from TinyMCE and CK Editor to our own home-grown solutions. The thing is that we’ve never felt 100% happy with any software we’ve used.

+ + + + + + +

Before we begin yet another project that will involve an RTE (possibly one we roll ourselves), we thought it might be good to ask around to get some feedback from you and/or your clients. We’ve put together two short surveys—one aimed at developers, the other at authors/content folks—to help us better understand what you do with RTEs and what your preferences are with respect to their overall user interface.

+ + + + + + +

If you have 5 minutes (and we promise that’s all it will take), we’d love for you to take one of the following two surveys:

+ +

Thanks in advance for your help. When we finish with the survey, we’ll be happy to share the findings.

+ diff --git a/export/2011-08-15-now-read-this-6.md b/export/2011-08-15-now-read-this-6.md new file mode 100644 index 0000000..cd2ef6e --- /dev/null +++ b/export/2011-08-15-now-read-this-6.md @@ -0,0 +1,38 @@ +--- +title: "Now read this VI" +date: 2011-08-15 15:45:18 +comments: false +tags: + - "browsers" + - "CSS" + - "culture & society" + - "Git" + - "mobile" + - "optimization & performance" + - "usability" + - "version control" + - "web standards" +permalink: /archives/now-read-this-6/ +--- + +
We found some great design, UX, optimization, and innovation links for you last week. Enjoy!
+ diff --git a/export/2011-08-22-now-read-this-7.md b/export/2011-08-22-now-read-this-7.md new file mode 100644 index 0000000..9c027e0 --- /dev/null +++ b/export/2011-08-22-now-read-this-7.md @@ -0,0 +1,38 @@ +--- +title: "Now Read This VII" +date: 2011-08-22 11:27:58 +comments: false +tags: + - "browsers" + - "client relations" + - "coding" + - "CSS" + - "culture & society" + - "design" + - "Git" + - "humor" + - "JavaScript" + - "mobile" + - "version control" +description: "In this week’s link round-up, we bring you the reasoning behind Typekit’s font-loading strategy, two new CSS grid systems, a fantastic mash-up of Peanuts with Jaws, and the story of a woman ejected from a Houston bar for tweeting..." +permalink: /archives/now-read-this-7/ +--- + +

In this week’s link round-up, we bring you the reasoning behind Typekit’s font-loading strategy, two new CSS grid systems, a fantastic mash-up of Peanuts with Jaws, and the story of a woman ejected from a Houston bar for tweeting something the General Manager didn’t like. Happy reading!

+ diff --git a/export/2011-08-29-now-read-this-8.md b/export/2011-08-29-now-read-this-8.md new file mode 100644 index 0000000..0a46272 --- /dev/null +++ b/export/2011-08-29-now-read-this-8.md @@ -0,0 +1,32 @@ +--- +title: "Now Read This VIII" +date: 2011-08-29 11:49:36 +comments: true +tags: + - "(x)HTML" + - "animation" + - "CSS" + - "culture & society" + - "design" + - "JavaScript" + - "mobile" + - "progressive enhancement" + - "web standards" +description: "Last week, Steve Jobs resigned and Twitter launched a resource for quickly building prototypes and apps. Here’s the roundup of those and other links you’ll want to check out:" +permalink: /archives/now-read-this-8/ +--- + +

Last week, Steve Jobs resigned and Twitter launched a resource for quickly building prototypes and apps. Here’s the roundup of those and other links you’ll want to check out:

+ diff --git a/export/2011-09-12-now-read-this-9.md b/export/2011-09-12-now-read-this-9.md new file mode 100644 index 0000000..c58d9eb --- /dev/null +++ b/export/2011-09-12-now-read-this-9.md @@ -0,0 +1,47 @@ +--- +title: "Now Read This IX" +date: 2011-09-12 14:04:12 +comments: false +tags: + - "browsers" + - "business" + - "coding" + - "culture & society" + - "design" + - "iOS" + - "JavaScript" + - "mobile" + - "web standards" +description: "Apologies for not posting links last week…between the holiday on Monday and spending all of Tuesday without power, I got a little behind. Here are some highlights from the last two weeks:" +permalink: /archives/now-read-this-9/ +--- + +

Apologies for not posting links last week…between the holiday on Monday and spending all of Tuesday without power, I got a little behind. Here are some highlights from the last two weeks:

+ diff --git a/export/2011-09-29-on-redirecting-mobile-traffic.md b/export/2011-09-29-on-redirecting-mobile-traffic.md new file mode 100644 index 0000000..7d5c9f8 --- /dev/null +++ b/export/2011-09-29-on-redirecting-mobile-traffic.md @@ -0,0 +1,27 @@ +--- +title: "On Redirecting Mobile Traffic" +date: 2011-09-29 11:24:34 +comments: false +tags: + - "business" + - "mobile" + - "usability" +description: "While perusing the latest Costco email, I stumbled onto a pretty sweet looking mini-greenhouse and decided to click through to read more about the product. Unfortunately for me, I was on my phone and Costco is not particularly savvy..." +permalink: /archives/on-redirecting-mobile-traffic/ +--- + +

While perusing the latest Costco email, I stumbled onto a pretty sweet looking mini-greenhouse and decided to click through to read more about the product. Unfortunately for me, I was on my phone and Costco is not particularly savvy about how they handle the redirection of mobile traffic to their “mobile friendly” site. Instead of landing on the product page as I should have, I was redirected to the mobile landing page.

+
+

Now I am not completely sold on the need for creating (and maintaining) an independent mobile version of every website; sometimes it makes sense, but other times it’s overkill. That said, however, I am sure of one thing: if you do redirect mobile traffic, make sure you do so to an equivalent URI; don’t redirect all requests to the homepage. When users click a link, they have an expectation of what they will find on the other end of that link. Make sure you meet that expectation.

+

I can’t speak to Costco’s server setup specifically because it seems their main site is .Net and the mobile version is JSP/Struts (neither of which are my cup of tea), but for those of you running Apache (which more than half of you likely are), setting up proper redirection is relatively easy using an .htaccess file:

+

# setup
RewriteEngine on
RewriteBase /
# product redirection based on iPhone and product request
# note: the user agent check is contrived, you should have
# a much more robust checker
RewriteCond %{HTTP_USER_AGENT} iPhone
RewriteCond %{QUERY_STRING} Prodid=(\d+)
RewriteRule ^Browse\/Product.aspx http://m.costco.com/costco/product/productDirectDetail.do?itemId=%1 [R=301,L]
+

Here’s what this snippet does:

- Turns on Apache’s rewrite engine and sets the rewrite base to the site root.
- Checks whether the requesting user agent contains “iPhone” (a deliberately simplistic test, as the snippet’s own comment notes).
- Captures the numeric Prodid value from the desktop product page’s query string.
- Issues a permanent (301) redirect to the equivalent product detail page on the mobile site, passing the captured ID along as itemId.

With this sort of simple redirection in place, you can ensure users get where they want to go, quickly and easily.

+

diff --git a/export/2011-10-12-from-mobile-friendly-to-mobile-first.md b/export/2011-10-12-from-mobile-friendly-to-mobile-first.md new file mode 100644 index 0000000..19edb3a --- /dev/null +++ b/export/2011-10-12-from-mobile-friendly-to-mobile-first.md @@ -0,0 +1,60 @@ +--- +title: "From “Mobile Friendly” to “Mobile First”" +date: 2011-10-12 11:33:00 +comments: true +tags: + - "(x)HTML" + - "accessibility" + - "browsers" + - "coding" + - "content management" + - "design" + - "ExpressionEngine" + - "mobile" + - "optimization & performance" + - "progressive enhancement" + - "projects & products" + - "usability" + - "web standards" +description: "If you’re reading this on a desktop browser, you may not have noticed, but we just turned this blog on it’s head, design-wise. Those of you browsing on a tablet or mobile device, however, should be enjoying a much more comfortable..." +permalink: /archives/from-mobile-friendly-to-mobile-first/ +--- + +

+ If you’re reading this on a desktop browser, you may not have noticed, but we just turned this blog on its head, design-wise. Those of you browsing on a tablet or mobile device, however, should be enjoying a much more comfortable reading experience. Now that’s not to say that we’d been giving mobile the short end of the stick before pushing out the new code, but our approach to mobile has changed drastically since the redesign of this blog early last year, and we’re really happy to be able to bring the lessons we’ve learned back here.

+

+ Our initial approach to this site involved building out the desktop view as the default layout. We then used CSS3 media queries to “dumb down” the experience by removing some page components and reconfiguring some of the content. It worked pretty well, but we were relying on our mobile users to have media query support in their devices. Given that our audience tends to have more capable mobile devices, that was probably a safe assumption, but it wasn’t all that kind to people who happen to have less-capable devices. And, in truth, it didn’t sit well with us because it didn’t really jibe with the progressive enhancement philosophy I advocate so hard for. That said, the world of media queries was pretty new at the time.

+

+ We know better now.

+

+ In the past year, our approach to mobile has become much more nuanced as we embraced the “mobile first” idea Luke Wroblewski has been pushing for (and recently wrote a book about). The idea of “mobile first” is that you optimize your site for use in a mobile context and then layer on additional styles, JavaScript, and content as you find you have more real estate to work with or a more capable device.

+

+ In terms of media queries, it meant making the mobile layout the default (so it reaches browsers that lack media query support) and then tweaking the styles as the browser’s width exceeds certain milestones (using min-width values rather than the max-width ones we’d previously been using to shrink the site). On the JavaScript end, it meant withholding certain scripts until we knew they’d be useful (the comment preview, for instance, is terribly distracting on a mobile device). These were techniques we’d already put into practice on other sites, but had not gotten around to applying here.

+
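That min-width approach can be sketched roughly like this (a minimal illustration, not this site’s actual stylesheet; the selectors are invented, though the 570px and 651px breakpoints are the ones mentioned in this post):

```css
/* Base styles: the narrow, single-column layout every browser gets,
   including those without media query support. */
.sidebar { display: none; }
.search  { position: static; }

/* Layer enhancements on with min-width queries as the viewport widens. */
@media screen and (min-width: 570px) {
  /* e.g. relocate search from the footer up into the header */
  .search { position: absolute; top: 0; right: 0; }
}

@media screen and (min-width: 651px) {
  /* introduce sidebar content on wider screens */
  .sidebar { display: block; }
}
```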

+ The biggest difference most people will see is the visual one. In its previous incarnation, this site only supported two layouts: wide and narrow. It didn’t matter if you were on a tablet or a handheld device; you were getting the mobile layout. Under the new setup, the layout is far more nuanced, adjusting roughly six times. Some of the adjustments are subtle (such as the relocation of search from the footer to the header as you cross the 570px mark), but others are more substantial (such as the introduction of sidebar content at 651px). Below are a series of screenshots depicting the differences between the two approaches at different milestones.

+
+
A comparison of the narrow layout of this blog. The differences are pretty subtle, mostly having to do with spacing.
+
+
A comparison of the small screen layout of this blog. Small tablet and Kindle users benefit from slightly larger text.
+
+
A comparison of the mid-size layout of this blog. Slightly larger tablets will get a sidebar (as appropriate) and the line lengths are a little better.
+
+
A comparison of the wide/desktop layout of this blog. There are some minor spacing differences, but not much else has changed.
+

+ Taken altogether, the differences don’t appear that substantial, but given that every device/browser has access to the narrow layout, the reading experience is vastly improved. Note: to get IE versions 8 and under to apply media queries, we’re using Scott Jehl’s Respond.js.

+

+ So that gives you a pretty good sense of how we’re adjusting the layout based on the device size, but there are a few other niceties going on under the hood that I’d like to share as well. Here’s a round-up:

+
1. Our content images are now being served via src.sencha.io, a free web service and CDN from the folks behind Sencha that takes the pain out of serving images based on the device requesting them. To keep the implementation simple (and easily swappable), I wrote an ExpressionEngine plugin to automatically swap images for their src.sencha.io equivalents (EE1 only for right now, but I’ll port it shortly). For more detail on using src.sencha.io, check out this article.
2. Comments are now loaded via Ajax. I know, it sounds crazy, but it makes sense. By default, we include a link to an alternate version of the blog post template with comments exposed (well, really it’s the same template with an additional URL segment passed in). Then, using JavaScript, we look for that link and replace it with the comments thread after the page finishes loading. You know, progressive enhancement. The overall effect is that it reduces the time it takes to download the page, which means you get to the content you want to read faster. You can check out the code that makes it work over on Github (it’s a slightly modified version of Scott Jehl’s original script).
3. Our social plugins now protect your privacy. We hadn’t really thought about the fact that every time a script is included from Twitter or Facebook, that can be used to track your movement around the web. Once we realized it though, we decided we needed to change our social links to protect you. As such, we’ve changed the links to work without JavaScript and using images delivered by servers we control. Google+ has been dropped for the time being as they do not seem to offer a consistent destination URL for their +1 service. If they change that or someone can tell us what it is, we’ll likely bring it back.
4. Better support for services like Readability. We want you to be comfortable reading our content; if you don’t like our layout, that’s fine with us.
+
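The Ajax comment-loading technique described above can be sketched roughly as follows (a simplified illustration, not the actual script on Github; the `.comments-link` class, `comments/` URL segment, and `#comments` id are invented hooks for this example):

```javascript
// Progressive enhancement: without JavaScript, readers simply follow the
// plain link to the alternate template that has comments exposed.

// The alternate template is assumed to be the post URL plus an extra segment.
function buildCommentsUrl(postUrl) {
  return postUrl.replace(/\/?$/, '/') + 'comments/';
}

// Only touch the DOM when one exists (i.e., in a browser).
if (typeof document !== 'undefined') {
  document.addEventListener('DOMContentLoaded', function () {
    var link = document.querySelector('a.comments-link');
    if (!link) { return; }
    var xhr = new XMLHttpRequest();
    xhr.open('GET', buildCommentsUrl(window.location.href));
    xhr.onload = function () {
      var wrapper = document.createElement('div');
      wrapper.innerHTML = xhr.responseText;
      var thread = wrapper.querySelector('#comments');
      // Swap the link for the fetched comment thread once the page has loaded.
      if (thread) { link.parentNode.replaceChild(thread, link); }
    };
    xhr.send();
  });
}
```

If the request fails (or JavaScript is unavailable), the plain link stays in place, so nothing is lost.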

+ What do you think? Did we miss anything that would make your reading experience better on this blog?

+

+PS - If you’re interested, I’ll be giving a full-day Adaptive Web Design workshop in Amsterdam on 29 November 2011, during which I will be discussing these topics and more in much greater depth and mentoring attendees on how to craft truly rich web experiences with progressive enhancement. Tickets are available on EventBrite.

diff --git a/export/2011-10-25-progressive-enhancement-and-expressionengine.md b/export/2011-10-25-progressive-enhancement-and-expressionengine.md new file mode 100644 index 0000000..f4d2707 --- /dev/null +++ b/export/2011-10-25-progressive-enhancement-and-expressionengine.md @@ -0,0 +1,38 @@ +--- +title: "Progressive Enhancement and ExpressionEngine" +date: 2011-10-25 13:17:00 +comments: false +tags: + - "(x)HTML" + - "accessibility" + - "browsers" + - "business" + - "coding" + - "conferences" + - "content management" + - "CSS" + - "design" + - "ExpressionEngine" + - "JavaScript" + - "mobile" + - "optimization & performance" + - "presentations" + - "progressive enhancement" + - "usability" + - "web standards" +description: "This past week was a bit of a whirlwind as Kelly and I flew to DC for a few meetings and then to NYC for the ExpressionEngine CodeIgniter Conference . We had a blast at the conference, meeting new people, seeing old friends, and eating..." +permalink: /archives/progressive-enhancement-and-expressionengine/ +--- + +

+ This past week was a bit of a whirlwind as Kelly and I flew to DC for a few meetings and then to NYC for the ExpressionEngine CodeIgniter Conference. We had a blast at the conference, meeting new people, seeing old friends, and eating a ton of great food. While we were there, we got name-dropped by the affable EllisLab CEO, Leslie Camacho, in his keynote address for our work on the forthcoming native rich text editor for ExpressionEngine, and we also met up with the always charming Jeremy Keith (and the lovely Jessica Spengler) to hatch some R4G plans that will be unveiled in the next week or so.

+

+ The best part for me, though, was getting to talk about progressive enhancement. The conference organizer, Robert Eerhart, gave me an hour to advocate on behalf of users and I made the most of that time by delving into the ways and means of progressive enhancement and how it can be accomplished in ExpressionEngine (and, to a lesser extent, CodeIgniter). I got tons of great questions from the audience and had many lively discussions after stepping off the stage as well. It was really nice to see so many people getting excited about a topic I’m incredibly passionate about.

+

+ I’ve posted my slides to SlideShare. If you were able to make the presentation, these will hopefully help jog your memory when it comes to the techniques I discussed, but if you couldn’t make the show, they’re also up for your enjoyment.

+
+
+

+ I’d like to thank everyone who made EECI2011 so amazing, and a special thank you to everyone who picked up a copy of my book while we were there. If you didn’t get one, you can order a copy over on the Easy Readers site; I’ll be happy to sign it for you if you reply to the receipt email and let the team know you missed out at the event.

+

+ Thanks again and I look forward to seeing you all again next year!

diff --git a/export/2011-11-08-slides-from-fowd-nyc-2011.md b/export/2011-11-08-slides-from-fowd-nyc-2011.md new file mode 100644 index 0000000..76dd7ab --- /dev/null +++ b/export/2011-11-08-slides-from-fowd-nyc-2011.md @@ -0,0 +1,21 @@ +--- +title: "Slides from FoWD NYC 2011" +date: 2011-11-08 03:23:52 +comments: true +tags: + - "(x)HTML" + - "accessibility" + - "browsers" + - "coding" + - "conferences" + - "optimization & performance" + - "presentations" + - "progressive enhancement" + - "web standards" +description: "Below you’ll find the slides from the talk I gave earlier today at Future of Web Design titled “ HTML 5: Smart Markup for Smarter Websites .” It’s been a long day, so that’s it for now. I’ll post my notes from some of the talks I caught..." +permalink: /archives/slides-from-fowd-nyc-2011/ +--- + +

Below you’ll find the slides from the talk I gave earlier today at Future of Web Design titled “HTML5: Smart Markup for Smarter Websites.” It’s been a long day, so that’s it for now. I’ll post my notes from some of the talks I caught in a day or so (once I’ve finished prepping for Wednesday’s workshop).

+
+

Enjoy!

diff --git a/export/2011-11-16-on-adaptive-vs-responsive-web-design.md b/export/2011-11-16-on-adaptive-vs-responsive-web-design.md new file mode 100644 index 0000000..eef72dd --- /dev/null +++ b/export/2011-11-16-on-adaptive-vs-responsive-web-design.md @@ -0,0 +1,25 @@ +--- +title: "On Adaptive vs. Responsive Web Design" +date: 2011-11-16 17:23:00 +comments: true +tags: + - "(x)HTML" + - "accessibility" + - "coding" + - "CSS" + - "design" + - "JavaScript" + - "mobile" + - "progressive enhancement" + - "usability" + - "web standards" +description: "In the past few months, I’ve spent an inordinate amount of time discussing the differences between the “adaptive” and “responsive” web design philosophies. Don’t get me wrong, I love having these discussions, but I felt the need to set..." +permalink: /archives/on-adaptive-vs-responsive-web-design/ +--- + +

+ In the past few months, I’ve spent an inordinate amount of time discussing the differences between the “adaptive” and “responsive” web design philosophies. Don’t get me wrong, I love having these discussions, but I felt the need to set the record straight: these two philosophies are not at odds, despite numerous blog posts and tweets to the contrary.

+

+“Responsive web design,” as coined by Ethan Marcotte, means “fluid grids, fluid images/media & media queries.” “Adaptive web design,” as I use it, is about creating interfaces that adapt to the user’s capabilities (in terms of both form and function). To me, “adaptive web design” is just another term for “progressive enhancement,” of which responsive web design can (and often should) be an integral part; it is a more holistic approach to web design in that it also takes into account varying levels of markup, CSS, JavaScript and assistive technology support.

+

+ For the record, I do think it’s important to draw a distinction between “adaptive web design” and “adaptive layouts” because “adaptive layouts” implies only the use of media queries, which may not be done in a progressively enhanced way. Adaptive layouts achieved in a mobile-first manner, however, are very likely progressive enhancement and, thereby, a means of “adaptive web design.”

diff --git a/export/2011-12-12-crafting-rich-experiences-with-progressive-enhancement-at-beyond-telle.md b/export/2011-12-12-crafting-rich-experiences-with-progressive-enhancement-at-beyond-telle.md new file mode 100644 index 0000000..5a717c5 --- /dev/null +++ b/export/2011-12-12-crafting-rich-experiences-with-progressive-enhancement-at-beyond-telle.md @@ -0,0 +1,29 @@ +--- +title: "Crafting Rich Experiences with Progressive Enhancement at Beyond Tellerrand" +date: 2011-12-12 21:05:00 +comments: true +tags: + - "(x)HTML" + - "accessibility" + - "browsers" + - "coding" + - "conferences" + - "CSS" + - "design" + - "JavaScript" + - "presentations" + - "progressive enhancement" + - "usability" + - "web standards" +description: "After a whirlwind trip to 4 countries (5 if you count Florida), I am back to a rock-solid internet connection and got a moment to take a breath and post my slides from the first stop on the trip: Beyond Tellerrand in Düsseldorf, Germany." +permalink: /archives/crafting-rich-experiences-with-progressive-enhancement-at-beyond-telle/ +--- + +

+ After a whirlwind trip to 4 countries (5 if you count Florida), I am back to a rock-solid internet connection and got a moment to take a breath and post my slides from the first stop on the trip: Beyond Tellerrand in Düsseldorf, Germany.

+

+ This talk is an update of the short session I gave at WebVisions in Portland earlier this year. It covers all the topics I address in my book and more:

+
+
+

+ It was an awesome trip and I met a bunch of great people there as well as at my Adaptive Web Design workshops in Amsterdam and Reykjavik. I took a ton of photos (many of which are slowly finding their way onto Instagram and Flickr) and also enjoyed spending three weeks on the road with my loving wife and partner, Kelly. It was the perfect way to spend our 10-year anniversary. Now it’s back to the grindstone so we can bang out a few more projects before the new year. W00t!

diff --git a/export/2011-12-15-an-end-to-aging-ie-installs.md b/export/2011-12-15-an-end-to-aging-ie-installs.md new file mode 100644 index 0000000..e3eecda --- /dev/null +++ b/export/2011-12-15-an-end-to-aging-ie-installs.md @@ -0,0 +1,15 @@ +--- +title: "An End to Aging IE Installs" +date: 2011-12-15 19:34:06 +comments: false +tags: + - "browsers" + - "web standards" +description: "Today is a momentous day." +permalink: /archives/an-end-to-aging-ie-installs/ +--- + +

Today is a momentous day.

+

I’ve spent years griping about IE6’s staying power and lamenting Microsoft’s earlier decision to advocate against upgrading to IE7 (a decision they didn’t stick with, thankfully). Today, though, Microsoft turned over a new leaf, announcing that they will push IE updates to anyone who takes part in their Windows Update service.

+

What does this mean? Well, it means that grandma will be upgraded to IE8 if she’s still on Windows XP or IE9 if she’s on Vista or Windows 7.

+

Corporations (and individuals) still have the ability to opt out of these updates, but this move should put an end to upgrades that haven’t happened purely because users didn’t know how to upgrade to a new version of IE. As Microsoft’s own Peter Laudati so eloquently put it, “Upgrade Your Parents Browser Weekend” is now officially obsolete.

diff --git a/export/2011-12-16-javascript-less-google-finally.md b/export/2011-12-16-javascript-less-google-finally.md new file mode 100644 index 0000000..d5a0433 --- /dev/null +++ b/export/2011-12-16-javascript-less-google-finally.md @@ -0,0 +1,32 @@ +--- +title: "JavaScript-less Google+ (finally)" +date: 2011-12-16 16:03:00 +comments: false +tags: + - "business" + - "coding" + - "progressive enhancement" + - "social networks" + - "usability" +description: "When we launched the mobile-first version of this blog, we opted not to include Google+ as one of the sharing options because there was no way to make it work without JavaScript (a fact which undermined both our progressive enhancement..." +permalink: /archives/javascript-less-google-finally/ +--- + +

+ When we launched the mobile-first version of this blog, we opted not to include Google+ as one of the sharing options because there was no way to make it work without JavaScript (a fact which undermined both our progressive enhancement philosophy and the privacy of our readers). I tried digging into the (IMHO) over-engineered code that manages the +1 button to find an endpoint, but after about an hour decided it wasn’t worth it. Thankfully there are others out there who are more persistent than I am, and a way to share on Google+ without the Google-supplied JavaScript is now available thanks to the folks at TechLifeWeb.

+

+ It started as a bookmarklet, but the endpoint URL for the mobile share form is easily extracted from there (note: this works, but is not ideal… see Update #2 below):

+
+

+ https://m.google.com/app/plus/x/?content=CONTENT+AND+URL+GOES+HERE&v=compose&hideloc=1

+
+

+ We’ve gone ahead and implemented Google+ now, so if you are a fan… happy linking!

+

+Update #1: In playing around with it a bit more, there’s a strange behavior whereby Google says there’s a problem with the post, but it actually does get into your Stream. I’ll dig around a bit and see if I can sort that out.

+

+Update #2: Google is finally supporting this properly. Here’s the URL scheme:

+
+

+ https://plus.google.com/share?url=URL+GOES+HERE

+
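With that endpoint, sharing needs no JavaScript at all. Here’s a minimal sketch of a plain share link (the destination URL below is a placeholder; whatever you link, the `url` parameter must be URL-encoded):

```html
<!-- Illustrative only: a JavaScript-free Google+ share link.
     The url parameter value is a placeholder and must be URL-encoded. -->
<a href="https://plus.google.com/share?url=http%3A%2F%2Fexample.com%2Fmy-post%2F">
  Share on Google+
</a>
```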
diff --git a/export/2012-01-09-progressive-enhancement-vs.-hardboiled-design.md b/export/2012-01-09-progressive-enhancement-vs.-hardboiled-design.md new file mode 100644 index 0000000..2a8d3a2 --- /dev/null +++ b/export/2012-01-09-progressive-enhancement-vs.-hardboiled-design.md @@ -0,0 +1,87 @@ +--- +title: "Progressive Enhancement vs. Hardboiled Design" +date: 2012-01-09 15:27:00 +comments: true +tags: + - "accessibility" + - "design" + - "progressive enhancement" + - "usability" +description: "Late last week, I linked my Forrst followers to Stephanie Rieger’s awesome post “A Plea for Progressive Enhancement ” which offered a even-handed critique of a sliding menu interaction on the website for the Obama campaign. The main..." +permalink: /archives/progressive-enhancement-vs.-hardboiled-design/ +--- + +

+ Late last week, I linked my Forrst followers to Stephanie Rieger’s awesome post “A Plea for Progressive Enhancement,” which offered an even-handed critique of a sliding menu interaction on the website for the Obama campaign. The main thrust of her complaint was that it didn’t work on most of the mobile devices she tested, including an iPhone 4 running iOS 4.3.5—one version prior to the release of iOS 5.

+
+

+ [T]he menu failed. Never even opened. Suddenly, the site was without navigation…at all.

+
+

+Joe Seddon, a UK-based designer, shared his reaction to the post in the comment thread, but if you’re not a Forrst member, you can’t read the comments, so I wanted to share it here:

+
+

+ I know you’re a big fan of progressive advancement Aaron and I have huge respect for you as a designer, however I disagree with your way of thinking and feel it is holding our industry back.

+

+ Starting from the bottom instead of the top limits creativity. By designing from the top we as designers can take advantage of new technology and build the best user experience possible for those who use the best browsers. I agree that designs should work to a usable degree on every browser and device in which there is a decent level of traffic coming from, however this doesn’t mean we should have to start designing for them first.

+

+ In the case of Brad Frost, he should keep his nifty slider on Barack Obama’s website however on mobile he should find an alternative solution that works. If this means removing the slider all together and replacing it with a simpler navigation method then so be it. He shouldn’t limit the experience of the desktop user just because of the mobile user doesn't have a device that supports this or that.

+
+

+ I don’t mean to pick on Joe here, but he shares a common misconception about progressive enhancement. One I hope my response (below) dispels:

+
+

+@JoeSeddon It sounds like you’re firmly in the Andy Clarke camp on this one, but I couldn’t disagree more with your statement that my “way of thinking” (i.e. progressive enhancement) is “holding our industry back.” If anything, I think it is the way forward. And for the record, I’m not the only one thinking this way: Jeffrey Zeldman, @adactio, Ethan Marcotte, Daniel Mall, Scott Jehl & the Filament Group, Brad Frost, Stephanie and @bryanrieger, and countless others support and promote progressive enhancement every day.

+
+

+ Starting from the bottom instead of the top limits creativity.

+
+

+ Actually no. Building a website is a heck of a lot like building a house—you need a solid foundation and “good bones” for it to stand the test of time and for it to be able to support the amazing things you want to do with it. Your server forms the foundation—keeping the whole website stable. And smart, semantic markup is the framing—the joists and supports that allow you to build higher without worrying about collapse.

+

+ To take the analogy further, your backend (assuming you have an API or at least a DB and some code to talk to it) is like the electrical, water, and communication systems which will support the fixtures of your site. CSS is your façade and interior design. Basic HTTP (e.g. links and communication via POST and GET) and JavaScript (probably in concert with an API) connects the systems to your fixtures (most likely a combo of HTML, CSS & JS) and makes them functional.

+

+ All of these pieces are orchestrated by your IA, User Flows, and UX design—the blueprints, elevations, etc. of the web world. And, to be honest, that’s where you should be doing the lion’s share of your creative thinking when it comes to interface.

+
+

+ By designing from the top we as designers can take advantage of new technology and build the best user experience possible for those who use the best browsers.

+
+

+ You, as a designer, should be considering the implications of technical decisions and options at the planning stage. If you’re a freelancer or run a small shop, you may be the UX person too, but if you aren’t, you should be working with your UX person to propose innovative interactions and then plan out how those can be used on the latest and greatest browsers and what the experience would be on less capable browsers and devices. It all starts with the planning.

+

+ Nothing in progressive enhancement says you can’t use the latest and greatest technologies and techniques; it just asks you to respect your content and your users by being smart about how you apply them. Remember: browsers and technologies come and go<sup>1</sup>; focus on your content and your users.

+
+

+ I agree that designs should work to a usable degree on every browser and device in which there is a decent level of traffic coming from, however this doesn't mean we should have to start designing for them first.

+
+

+ First of all, analytics are not always 100% accurate and, secondly, as web designers and developers, we never know who is coming to our site and what they are looking to do. For all you know, there’s a lady out there looking to spend millions of dollars on the product or service your site (or your client’s) is providing, and your analytics program can’t tell you that she’s the 0.001% that came to your site on an aging BlackBerry. Analytics can tell you general trends, but they should only be used for general guidance. I’d rather build something that is going to work for a user regardless of her device. I’m not going to waste time trying to re-create the awesome experience she may have in the latest version of Chrome or Firefox, but I sure as hell want to make sure the experience she does have is a positive one.

+
+

+ In the case of Brad Frost, he should keep his nifty slider on Barack Obama's website however on mobile he should find an alternative solution that works.

+
+

+ Point of clarification: Brad does not work for the Obama campaign, he simply brought Stephanie’s attention to the interface, but to your point: “he should find an alternative solution that works.” Absolutely! Building from a workable baseline up to the hi-fi experience of the sliding nav would accomplish that. There’s nothing to say that you can’t have your cake and eat it too; you just need to be smart about your approach—proper planning is key.

+
+

+ He shouldn't limit the experience of the desktop user just because of the mobile user doesn't have a device that supports this or that.

+
+

+ Of course he shouldn’t. Progressive enhancement doesn’t say that he should.

+

+ I think you should rethink what progressive enhancement is all about. Not to plug my own work, but the first chapter of my book lays it out pretty well. You can download it for free as a PDF or read the web-based version on .net Magazine.

+

+ 1. Don’t believe me? Look at how many companies built software and intranets around IE6. Why did they do it? It was considered a pretty good browser at the time. Need a more recent example? Look at WebDB (SQLite). It was introduced in WebKit and looked to be on track to become a formal W3C recommendation, but then it was dropped in favor of IndexedDB. I speak from experience when I say things like this can and often do bite you in the ass if you work on the bleeding edge.

+
+

+ After reading my incredibly lengthy response, Joe kindly wrote back:

+
+

+@AaronGustafson First of all I’d just like to say great post, and thanks for taking the time to reply to my post.

+

+ Your reply has actually made me think about progressive enhancement and “hardboiled design” and re-consider which one really is the best strategy. I like your analogy of building a house in particular and that’s what mainly made me re-think my stance. My biggest problem with progressive enhancement was building from the bottom, as I truly did believe building from the top would allow me to deliver a better experience to those who use the better browsers/devices. In the words of Andy Clarke, I didn’t want to just give users who are on the latest version of Google Chrome little visual rewards.

+

+ Thanks for linking me to the first chapter of your book, I’ve heard a lot of positive things about it and it certainly has gone down well with its readers and the media. I’ll read the first chapter and see where I stand after it.

+
+

+ I’m happy to have gotten him to reconsider his stance on progressive enhancement. Hopefully we’ve gained another convert. Time will tell. ;-)

diff --git a/export/2012-02-03-html5-is-the-new-dhtml.md b/export/2012-02-03-html5-is-the-new-dhtml.md new file mode 100644 index 0000000..c517205 --- /dev/null +++ b/export/2012-02-03-html5-is-the-new-dhtml.md @@ -0,0 +1,29 @@ +--- +title: "HTML5 is the new DHTML" +date: 2012-02-03 13:01:23 +comments: true +tags: + - "(x)HTML" + - "business" + - "client relations" + - "culture & society" + - "web standards" +description: "In a recent post, Adrian Roselli ranted a bit about the awkward position we are in with regard to HTML 5 . Here’s a taste:" +permalink: /archives/html5-is-the-new-dhtml/ +--- + +

In a recent post, Adrian Roselli ranted a bit about the awkward position we are in with regard to HTML5. Here’s a taste:

+
+

The trend continues where I speak to clients, vendors, young developers fresh out of college, and even the teachers/professors who instruct them and they don’t understand that HTML5 and CSS3 aren’t the same specification. I have repeatedly shown an HTML 4.01 site with CSS3 to explain that they are each distinct specifications which can be applied in different combinations of different versions. This is further complicated when JavaScript is folded into the mix—some folks even think jQuery is part of the HTML5 specification.

+
+

It’s true: For all intents and purposes, “HTML5” has become a meaningless catch-all marketing phrase defining a platform rather than a specification. It’s “DHTML” all over again.1

+

This all probably started with the fact that “HTML5,” as a spec, was always more than a markup language. Even from the very early days at the WHATWG (before they decided to go versionless and just call it “HTML”), “HTML5” was a markup language, an updated DOM interface, and a set of new APIs for interacting with browsers and devices. A few of us took issue with classifying it all as “HTML5,” but it’s not like anyone can tell Hixie what to do.

+

So yeah, from the beginning “HTML5” has been a bit of a misnomer, but the final blow to HTML5’s usefulness as a term—to me at least—came in the form of Apple’s “HTML5 Showcase.” It received a ton of attention in the press and really got the term “HTML5” out there… oddly enough while mostly demoing CSS3 features and making little to no attempt to disambiguate the technologies at work.

+

Ok, so what’s the problem with all of this? Some argue that there is no problem, that the public’s enthusiasm for “HTML5” can only bring about positive change on the web. I don’t disagree with that, but I also strongly believe semantics are important. Chris Mills summed up my feelings pretty well on the WaSP blog around this time last year:

+
+

This really isn’t good—I appreciate that it is good to have an umbrella term for a group of related technologies and techniques that would otherwise be difficult to talk about in conversation. “Ajax” and “Web 2.0” serve that purpose well. And it is ok to talk about closely-related specs such as Geolocation and Web Sockets as being under the HTML5 umbrella, as long as you clarify it somewhere (you can find a good example in Get familiar with HTML5!). But this is different—HTML5 and CSS3, for example, are two distinctly different technologies, and should not be confused with one another. To do so will impede learning and cause problems with development, documentation, and all manner of other things.

+
+

That’s the rub. When engaging in conversations, we need to know which “HTML5” is being discussed. Personally, when I discuss HTML5, I always draw a distinction between “HTML5” the marketing term and “HTML5” the specification, even if asked ambiguous questions about HTML5. I’m not saying everyone needs to know the precise differences between the two uses of the term, but it’s our duty to educate them that there is a difference, even if they can’t fathom the particulars.

+
    +
  1. DHTML, as you may recall, was a catch-all phrase that meant using HTML, CSS, and JavaScript together, but some people thought it was an actual technology in its own right. Thanks, marketing wonks!
+
diff --git a/export/2012-02-09-this-must-not-happen.md b/export/2012-02-09-this-must-not-happen.md new file mode 100644 index 0000000..31bb351 --- /dev/null +++ b/export/2012-02-09-this-must-not-happen.md @@ -0,0 +1,35 @@ +--- +title: "This Must Not Happen!" +date: 2012-02-09 11:56:54 +comments: true +tags: + - "browsers" + - "coding" + - "CSS" + - "design" + - "progressive enhancement" + - "web standards" +description: "When I opened my inbox this morning, I nearly fell over. According to Daniel Glazman , co-chair of the CSS Working Group at the W3C , browser makers are considering supporting the WebKit vendor prefix ( -webkit-* ) because the web..." +permalink: /archives/this-must-not-happen/ +--- + +

When I opened my inbox this morning, I nearly fell over. According to Daniel Glazman, co-chair of the CSS Working Group at the W3C, browser makers are considering supporting the WebKit vendor prefix (-webkit-*) because the web development community can’t be bothered to use the equivalent experimental properties for other browsers:

+
+

WebKit, the rendering engine at the heart of Safari and Chrome, living in iPhones, iPads and Android devices, is now the over-dominant browser on the mobile Web and technically, the mobile Web is full of works-only-in-WebKit web sites while other browsers and their users are crying. Many sites are sniffing the browser’s User-Agent string and filtering out non-WebKit browsers. As in the past with IE6, it’s not a question of innovation but a question of hardware market dominance and software bundled with hardware. But there is an aspect of the problem we did not have during the IE6 era: these web sites are also WebKit-specific because they use only “experimental” CSS properties prefixed with -webkit-* and not their Mozilla, Microsoft or Opera counterparts. So even if the browser sniffing goes away, web sites will remain broken for non-WebKit browsers…

+

In many if not most cases, the -webkit-* properties WebKit-specific web sites are using do have -moz-*, -ms-*, -o-* equivalents. Gradients, Transforms, Transitions, Animations, border-radius, all interoperable enough to be browser-agnostic. Their web authors need only a few minutes to make the site compatible with Mozilla, Microsoft or Opera. But they never did it.

+

Without your help, without a strong reaction, this can lead to one thing only and we’re dangerously not far from there: other browsers will start supporting/implementing themselves the -webkit-* prefix, turning one single implementation into a new world-wide standard. It will turn a market share into a de facto standard, a single implementation into a world-wide monopoly. Again. It will kill our standardization process. That’s not a question of if, that’s a question of when.

+
+

This idea has been floated in conversations for a few years, but this portion of Dan’s post represents an official discussion at the CSS Working Group. An official discussion that Adobe, Apple, Disruptive Innovations, Google, HP, Microsoft, Mozilla, Opera and the W3C were all participating in.

+

While it is true that writing them all out is tedious, vendor-specific prefixes serve a very valuable purpose: they allow a browser manufacturer to experiment with a property before it becomes an official part of the spec. And during that experimental phase, the syntax can (and often does) change. If you use vendor-specific prefixes, you do so at your own risk. That’s not to say you shouldn’t use them, but it is to say that you should be careful about when and how you use them.

+

The value of vendor-specific prefixes is not really in question here though; they are not the problem. We are. We are apparently too lazy to implement CSS in a consistent cross-browser fashion. WTF?!

+

Please, I beg you: Take 10 minutes out of your day today and update every site you can to use the other vendor-specific prefixed (and non-prefixed) versions of each -webkit-* property you find, even if you’re not sure it exists yet. And if you need help, ask.

+
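To make that concrete, here’s a sketch of what a responsibly-prefixed declaration looks like (the selector and values are invented for illustration; the unprefixed form goes last so the final spec wins out):

```css
/* Illustrative only: the same experimental properties written for every engine,
   not just WebKit, with the unprefixed standard last so it takes precedence. */
.feature-box {
  -webkit-border-radius: 10px;   /* older WebKit */
     -moz-border-radius: 10px;   /* older Gecko */
          border-radius: 10px;   /* the standard */

  -webkit-transition: opacity 0.3s ease;
     -moz-transition: opacity 0.3s ease;
      -ms-transition: opacity 0.3s ease;
       -o-transition: opacity 0.3s ease;
          transition: opacity 0.3s ease;
}
```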

UPDATE: If you want to scan your server for files that might need adjustment, try this from the command line:

+

find /var/www -type f -name "*.css" -exec grep -il "webkit" {} \;
+

If you want to run it locally on a Mac, you should change the folder to ~/Sites.

+
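If you’d like to sanity-check the scan before pointing it at a real web root, here’s a self-contained demo that runs the same pipeline against a couple of throwaway files (the file names and contents are invented for illustration):

```shell
# Create a scratch folder with one prefixed and one prefix-free stylesheet
workdir=$(mktemp -d)
printf '.a { -webkit-transform: rotate(5deg); }\n' > "$workdir/prefixed.css"
printf '.b { color: red; }\n' > "$workdir/plain.css"

# The same find/grep scan, pointed at the scratch folder:
# it lists only the stylesheets that mention "webkit" (case-insensitive)
matches=$(find "$workdir" -type f -name "*.css" -exec grep -il "webkit" {} \;)
echo "$matches"

rm -rf "$workdir"
```

Only prefixed.css should be listed; swap the scratch folder for /var/www (or ~/Sites) to run the real scan.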

UPDATE #2: I created a petition and a pledge:

+
    +
  1. Tell Microsoft, Mozilla, and Opera not to implement the -webkit-* vendor prefix, and
  2. Pledge to update every site you can to use the other vendor-specific prefixed (and non-prefixed) versions of each -webkit-* property you find, even if you’re not sure it exists yet.
+

Sign the petition “Don’t make -webkit- prefixes a de facto standard” on Change.org.

+

diff --git a/export/2012-03-14-egalitarianism-and-progressive-enhancement.md b/export/2012-03-14-egalitarianism-and-progressive-enhancement.md new file mode 100644 index 0000000..57d739f --- /dev/null +++ b/export/2012-03-14-egalitarianism-and-progressive-enhancement.md @@ -0,0 +1,77 @@ +--- +title: "Egalitarianism and Progressive Enhancement" +date: 2012-03-14 13:09:00 +comments: true +tags: + - "accessibility" + - "culture & society" + - "design" + - "progressive enhancement" + - "usability" + - "web standards" +description: "In 1971, John Rawls published A Theory of Justice , in which he described the following Thought Experiment he often conducted with students and other groups: The members of the Group were asked to design a society down to the very..." +permalink: /archives/egalitarianism-and-progressive-enhancement/ +--- + +

+ In 1971, John Rawls published A Theory of Justice, in which he described the following Thought Experiment he often conducted with students and other groups: The members of the Group were asked to design a society down to the very ethical principles that would guide the relationships of people within that society. They were given free rein and could create whatever kind of society they wanted—monarchy, anarchy, capitalist, communist—it was all up to them. The only stipulation Rawls placed on the experiment (and notified participants of) was that Group members were not allowed to know anything about who they would be as part of that society.

+

+ This twist to Rawls’s experiment can be attributed to the “Veil of Ignorance” theory conceived by John Harsanyi (a father of game theory), and it had a profound impact on how Groups chose to organize their hypothetical societies.

+

+ What Rawls discovered through these experiments is that when the Veil of Ignorance is in play, people gravitate toward the deepest and broadest forms of egalitarianism in order to ensure that even the least well-off or marginalized people are treated justly. In essence, it forces people to “walk a mile” in someone else’s shoes. After all, as a self-interested, rational human being, who would want to create a society that treats the elderly like crap if he might turn out to be elderly in that society?

+

+ The Veil of Ignorance is something we have to deal with in web design as well: As much as we may try to understand trends in our users—the browsers they use, the devices they are on, etc.—we can never know the full story. For instance, we may know that someone is coming to us on an iPhone, but we can’t (at least at this point) know whether they are using assistive technology like VoiceOver or even a braille touch feedback device. This is why concepts such as Usability and, moreover, Accessibility are so important. It’s also why progressive enhancement is my guiding philosophy—progressive enhancement is to web design what egalitarianism is to society.

+

+ Of course, whenever you start talking about egalitarianism, you attract the haters (and haters are going to hate). Here’s Gary Hull of the Ayn Rand Institute:

+
+

+ Egalitarianism, which claims only to want an 'equality' in end results, hates the exceptional man who, through his own mental effort, achieves that which others cannot…. Talent and ability create inequality…. To rectify this supposed injustice, we are told to sacrifice the able to the unable. Egalitarianism demands the punishment and envy of anyone who is better than someone else at anything.

+
+

+ Sounds a lot like the hardboiled/graceful degradation camp, right? To paraphrase: Progressive enhancement is holding us back by requiring us to give all users a dumbed-down experience. Wrong! Like egalitarianism’s critics, many of progressive enhancement’s critics fail to grasp the meaning of “equality” used by egalitarians (instead using a definition more akin to that used by socialist and communist philosophies). To quote Alexander Berkman (emphasis mine):

+
+

+ [E]quality does not mean an equal amount but equal opportunity…. Do not make the mistake of identifying equality in liberty with the forced equality of the convict camp. … It does not mean that every one must eat, drink, or wear the same things, do the same work, or live in the same manner. Far from it: the very reverse in fact…. Individual needs and tastes differ, as appetites differ. It is equal opportunity to satisfy them that constitutes true equality… Far from levelling, such equality opens the door for the greatest possible variety of activity and development.

+
+

+ Were Berkman a web designer (rather than an early 20th century anarchist), he would probably fall down on the side of progressive enhancement, as his statement echoes egalitarian aims perfectly: access to content and functionality without technological restriction. Progressive enhancement does not aim to give the same experience to every person on every device in every browser; that would be ludicrous. It simply asks that you honor your users (and your content) by giving them a positive experience irrespective of their own capabilities or those of their technology.

+
+

+ In my life, I’ve always been drawn to egalitarianism; I credit my grandparents for that. From a very young age, my grandparents encouraged me to follow the Golden Rule: Do unto others as you would have them do unto you. It's a simple maxim, but like egalitarianism it asks that you put yourself in someone else’s place and consider how your choices affect them. Would you want to be treated the way you treat others?

+

+ Surprisingly enough, even the Golden Rule has its critics. Whereas I see the Golden Rule as a positive motivator, some folks look at it and see it as a play to our inherent self-interest (e.g. selfishness). In other words, they argue that the message of the Golden Rule is more about you than it is about the “others.” I don’t really want to get into the whole is-a-truly-selfless-act-really-possible debate (Friends already nailed it anyway) because I think that line of thinking misses the point. To me, the point is simply consideration of the “other” in how you conduct yourself. To look outside yourself and your realm of knowledge and experience and actually empathize with another human being.

+

+ Progressive enhancement follows the Golden Rule because it is concerned with the “other”. That’s why accessibility is such a key part of building websites following the progressive enhancement philosophy. It’s about putting yourself in someone else’s shoes—someone whose abilities and situation probably differ from yours. We are a diverse lot after all.

+

+ Of course, whenever I bring up the importance of accessibility, I get reactions like this: My business is selling TVs. Blind people don’t buy TVs, so why should I cater to them?

+

+ Really? I know a lot of blind people with TVs. Sure, they may not be able to see it themselves, but their spouses, children, and friends likely can. And they can listen to it.

+

+ Back at my old ad agency, I had an email run-in with a department head over the National Federation of the Blind’s lawsuit against Target. The NFB was suing Target because the company refused to address issues with the accessibility of its website that prevented blind users (among others) from being able to shop there. So I passed around a link to Derek Featherstone’s post on the subject as suggested reading. The reaction I got from the department head was that of your typical free-market libertarian:

+
+

+ Is Target forcing blind people to shop there? If they don’t, does Target hurt them in some way?

+

+ If it doesn’t meet web standards, why don’t blind people just shop somewhere else?

+
+

+ These are fair points and are the very arguments we often hear against equality legislation like the Americans with Disabilities Act. Ignoring the legal requirements and altruistic motivations behind doing something to provide equal opportunity—and ignoring the fact that in many cases the government will give you tax credits for making your business more accessible—let’s consider the business benefits of being more accessible.

+

+ To return to the TV store analogy, for all we know, a potential customer—who just happens to be blind (or even just vision-impaired)—might be on the hunt for an awesome home theater system that would be a huge sale for whoever gets her business. If she can’t easily navigate our site to find what she’s looking for—or access our physical storefront—do you think she’s going to stick around and struggle through a frustrating (or potentially humiliating) experience just to give us her money? No way; she’s going to make her purchase somewhere that is more accommodating, that gives her equal opportunity to make a purchase by respecting her needs. So beyond doing the “right thing,” it’s in our self-interest to be as respectful as possible of our customers and potential customers—that’s good customer service.

+

+ Progressive enhancement considers customer service (a.k.a. user experience) at every level of an interface because it instructs us to provide equal opportunities to access content and functionality.

+
+

+ Back in January, Ben Hoh demonstrated his complete understanding of the progressive enhancement philosophy:

+
+

+ [Progressive enhancement] keeps the design open to possibilities of sexiness in opportune contexts, rather than starting with a “whole” experience that must be compromised. While it might simply seem like another way to achieve graceful degradation’s exact goal from the opposite direction, this newer approach is qualitatively different: because progressive enhancement doesn’t presume a single, ideal state to fall back from, it deals much better with emerging landscapes and multiple contexts. For example, developing an integrated design that provides an equally “full” and contextually appropriate experience for both mobile and desktop browsers is easier with progressive enhancement.

+
+

+ What a great way to put it. Eloquent, to say the least.

+

+ Interestingly, the intent of Ben’s post was not to sell people on the benefits of the progressive enhancement approach to web design but rather to ponder the question: what might progressive enhancement suggest in the world of culture and politics? It’s a subject I have been mulling over in my head for years and I thank him for finally coaxing it out of me.

+

+ Many people say it’s impolite to discuss politics (or religion), but I live for these discussions. Discussing either topic gives you so much insight into what makes a person tick, and I love getting to know people. And despite having never formally studied it, I just love philosophy and believe that my personal philosophy (which is largely shared by the team here at Easy) greatly informs the work that we do. I hope sharing it leads to some interesting discussions both here in the comments and (just maybe) out in the real world when we run into each other—be it at conferences or the coffee shop.

+

+P.S. - To see other perspectives on progressive enhancement and politics, I highly recommend reading Ben Hoh’s post and Barry Saunders’ follow-up.

diff --git a/export/2012-04-16-iir-redux.md b/export/2012-04-16-iir-redux.md new file mode 100644 index 0000000..62691f9 --- /dev/null +++ b/export/2012-04-16-iir-redux.md @@ -0,0 +1,35 @@ +--- +title: "iIR Redux" +date: 2012-04-16 13:02:00 +comments: true +tags: + - "CSS" + - "design" + - "iOS" + - "mobile" + - "optimization & performance" + - "progressive enhancement" + - "web standards" +description: "A few years back, I wrote a little article celebrating the fact that you could actually apply image-replacement techniques to images themselves . At the time, I was using it mainly for converting black and white printer-friendly logos..." +permalink: /archives/iir-redux/ +--- + +

+ A few years back, I wrote a little article celebrating the fact that you could actually apply image-replacement techniques to images themselves. At the time, I was using it mainly for converting black and white printer-friendly logos to colorized or reversed alpha-transparent PNGs, but I postulated that the technique could also easily be used to replace high resolution print-friendly imagery with web-ready graphics (impractical as that may be). Little did I know, six years later we’d see the advent of ultra-high resolution handheld devices like the iPhone 4 and iPad 3. Hell, at the time I was still rocking a Treo.

+

+ In a few recent projects, we’ve been offering high resolution graphics—mainly logos, icons and the like only for now, as we’re not looking to kill anyone’s mobile bandwidth allotment—to devices with a pixel density of two or more (@media screen and (min-device-pixel-ratio: 2)). Interestingly, the same technique I came up with years ago is perfectly suited for this purpose; it also lets you set a small, low resolution baseline image (mobile-first, ya know) and replace it with larger and/or higher resolution images as the available real estate or device support allows.

+

+ For those not keen to read my original article, here’s the jist (or gist—harharhar): take your favorite image-replacement technique and apply it to an img element. As a means of demonstrating the concept, here are two images:

+

+

<p id="social-links">
<a href=""><img src="fb-print.png" alt="Like us on Facebook"/></a>
<a href=""><img src="twitter-print.png" alt="Follow us on Twitter"/></a>
</p>
+

+ With those in place, you simply apply your preferred image-replacement technique to the img elements. I’m partial to Leahy/Langridge:

+

+

#social-links img {
background: center center no-repeat;
display: inline-block;
height: 0;
width: 25px;
padding-top: 25px;
overflow: hidden;
}
#social-links img[src*=fb] {
background-image: url(fb-screen.png);
}
#social-links img[src*=twitter] {
background-image: url(twitter-screen.png);
}
+

+ All that’s required to re-purpose this technique for high resolution images is to add in the background-size property (set to the dimensions of the original low density image, of course):

+

+

@media only screen and (min--moz-device-pixel-ratio: 2),
only screen and (-ms-min-device-pixel-ratio: 2),
only screen and (-o-min-device-pixel-ratio: 2),
only screen and (-webkit-min-device-pixel-ratio: 2),
only screen and (min-device-pixel-ratio: 2)
{
#social-links img {
background: top left no-repeat;
background-size: 25px 25px;
display: inline-block;
height: 0;
width: 25px;
padding-top: 25px;
overflow: hidden;
}
#social-links img[src*=fb] {
background-image: url(fb-2x.png);
}
#social-links img[src*=twitter] {
background-image: url(twitter-2x.png);
}
}
+

+ I think it’s a pretty useful technique overall. Of course, it’s really only useful for presentational images; it doesn’t make much sense for content images. And, of course, there’s still the question of whether it’s advisable to load a high-resolution image for someone who may be on a metered connection. If you are concerned about that sort of thing, take heart that the W3C is working on the issue. In the meantime, however, you may want to consider Foresight.js.

+

diff --git a/export/2012-04-24-dont-sell-out-your-users.md b/export/2012-04-24-dont-sell-out-your-users.md new file mode 100644 index 0000000..73797b7 --- /dev/null +++ b/export/2012-04-24-dont-sell-out-your-users.md @@ -0,0 +1,60 @@ +--- +title: "Don’t Sell Out Your Users" +date: 2012-04-24 13:17:00 +comments: true +tags: + - "(x)HTML" + - "coding" + - "culture & society" + - "progressive enhancement" + - "social networks" + - "usability" +description: "As a profession, we spend a lot of time thinking of the best ways to protect our users’ data and their privacy. In fact, most sites have exhaustive Privacy Policies detailing what information they collect and what they may do with it..." +permalink: /archives/dont-sell-out-your-users/ +--- + +

+ As a profession, we spend a lot of time thinking of the best ways to protect our users’ data and their privacy. In fact, most sites have exhaustive Privacy Policies detailing what information they collect and what they may do with it. That’s why I find it bizarre that many of these same sites have chosen to hand over their users’ browsing habits to third parties such as Twitter, Facebook, and Google without considering the implications.

+

+ You see, any time you add third-party widget code—a “tweet this” button, for example—a request is made to that third party’s server in order to retrieve the required snippet of JavaScript. When that request is sent, information is passed to the server in the form of headers and (as is often the case) cookies. The headers usually contain general information about the browser being used, the referring page, etc., and don’t generally pose much of a threat, but cookies are another matter altogether.

+

+ When a user visits a site—say, Facebook—she is typically cookied by that site. Browsers give a user some modicum of control over which sites can set a cookie, so she could have opted out of receiving Facebook’s cookie, but most users either don’t know they can or don’t know how to opt out of cookies. Once that cookie is in place, it is directly associated with a specific domain (e.g. facebook.com or one of its hostnames). As long as that cookie is active, it remains on the user’s computer and accompanies any requests sent to the specified domain. That means when this user visits a web page that includes a Facebook-supplied “Like” button, for example, Facebook can identify who she is (via the cookie) and what she’s looking at (via the request headers) without her even clicking the “Like” button. As she moves from page to page across the web, Facebook—or Google, or Twitter, or Pinterest, or AddThis, or any other service with decent distribution of its widgets—can effectively track her movement and build up a profile of her interests and browsing habits without her even knowing.

+

+ Now I’m pretty certain none of these companies started out wanting to track their users in such a manner, nor am I convinced many of them actively are (though I would not put it past Google or Facebook), but I think you can agree there is certainly potential for abuse here.

+

+ So what are we to do? Including buttons that easily allow our users to share content on their favorite social networks is extremely helpful for attracting more eyeballs to our content; it would be a shame to lose that opportunity. I agree, but just because a company offers a widget to make it simple for you to set up the button doesn’t mean you need to use it. Thankfully, it isn’t really that difficult to host “share” buttons yourself; you just have to know how to do it.

+

+ Below is a simplification of the current markup we use to achieve this on our blog. At present, we’ve chosen to support only four social networks: Twitter, Facebook, LinkedIn, and Google Plus.

+

+

<section id="bookmark">
<h2>Like it? Share it</h2>
<p class="twitter"><a href="https://twitter.com/intent/tweet?original_referer=THE-CURRENT-URL&amp;source=tweetbutton&amp;text=THE+TITLE+OF+THE+PAGE&amp;url=THE-CURRENT-URL&amp;via=OUR-TWITTER-ACCOUNT"><img src="/i/button-twitter.png" alt="Tweet"/></a></p>
<p class="facebook"><a href="http://www.facebook.com/sharer.php?u=THE-CURRENT-URL"><img src="/i/button-facebook.png" alt="Share on Facebook"/></a></p>
<p class="linkedin"><a href="https://www.linkedin.com/cws/share?url=THE-CURRENT-URL&amp;original_referer=THE-CURRENT-URL"><img src="/i/button-linkedin.png" alt="Share on LinkedIn"/></a></p>
<p class="google_plus"><a href="https://plus.google.com/share?url=THE-CURRENT-URL"><img src="/i/button-googleplus.png" alt="Share on Google Plus"/></a></p>
</section>
+

+ To trigger a share via Twitter, we use Twitter’s “tweet intent” URL: https://twitter.com/intent/tweet. We then supply several key-value pairs as part of the query string:

+
    +
  1. The referring page (ours) as original_referer;
  2. Any text we want included in the tweet—e.g. the title of the page—as text (n.b. be sure to replace spaces with “+”);
  3. The page to share as url; and
  4. (Optionally) a Twitter account handle you’d like the tweet to appear “via”.
+
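+ For the record, that query string can also be assembled in script rather than hand-built in a template. Here’s a minimal sketch in plain JavaScript (tweetIntentUrl is a hypothetical helper, not code from this site; the page URL, title, and our_account handle are placeholder values):

```javascript
// Build a Twitter "tweet intent" URL from its component parts.
// encodeURIComponent handles the escaping; encoded spaces in the
// text are then swapped for "+" as described in the list above.
function tweetIntentUrl(pageUrl, title, via) {
  var params = [
    'original_referer=' + encodeURIComponent(pageUrl),
    'source=tweetbutton',
    'text=' + encodeURIComponent(title).replace(/%20/g, '+'),
    'url=' + encodeURIComponent(pageUrl),
    'via=' + encodeURIComponent(via)
  ];
  return 'https://twitter.com/intent/tweet?' + params.join('&');
}
```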

+ Facebook seems less complicated, but it’s really not. Sure, the URL is simple—http://www.facebook.com/sharer.php with the URL supplied as u in the query string—but to control what Facebook displays when your page is shared, you need to add some meta tags that describe the page as an OpenGraph object. Here’s a sample from this blog post:

+

+

<meta property="og:site_name" content="The Easy Designs Blog"/>
<meta property="og:image" content="/i/facebook-icon.png?v=20111226"/>
<meta property="og:locale" content="en_US"/>
<meta property="fb:admins" content="aaronmgustafson"/>
<meta property="og:type" content="article"/>
<meta property="og:title" content="Don’t Sell Out Your Users"/>
<meta property="og:description" content="Most sites have exhaustive Privacy Policies detailing what information they collect and what they may do with it, which is why I find it bizarre that many of these same sites have chosen to hand over their users’ browsing habits to third parties such as Twitter, Facebook, and Google without considering the implications."/>
<meta property="og:url" content="http://blog.easy-designs.net/archives/dont-sell-out-your-users/"/>
+

+ This set of meta tags establishes this post as an “article” with a title, description, and a canonical URL, which ensures it is displayed in Facebook properly. Facebook maintains pretty decent documentation on OpenGraph and the pieces they support, and they also have a handy testing tool you can use to see if everything is making it to them properly. (It’s worth noting that, using the debugger, you can also force Facebook to update any previously cached content from a given URL.)

+

+ Rounding out the pack are LinkedIn and Google Plus. Both use OpenGraph like Facebook does, but LinkedIn only supports a subset of OpenGraph tags—og:title, og:url, and og:image (though only if the image is wider than 150px and taller than 80px)—and Google Plus would prefer you use the ridiculously convoluted attributes defined by schema.org, but will fall back to OpenGraph or basic meta title and description tags if necessary.

+

+ Now that the HTML links are working properly (you did test them, right?), you need some buttons. There are tons of options out there, but if you like the ones we are using, you can download a layered PSD from us and tweak it to your heart’s content. If, however, you want to go image-less, you could use one of the many great icon fonts out there (like IcoMoon). Whatever you do, just don’t link to images on a third-party site; that lands you right back in the same trap, because image requests also pass along headers and cookies.

+

+ And there you have it: a fully-functional set of sharing tools that requires no JavaScript and doesn’t sacrifice your users’ privacy.

+

+ If you want to take it a step further, it’s pretty easy to add a tiny bit of JavaScript to make the links trigger a popup when there’s enough real estate (after all, you probably don’t want to do that on mobile). Here’s the jQuery code we currently use on this site for that purpose:

+

+

$('#bookmark').delegate('a','click',function(e){
if ( $(window).width() > 700 )
{
e.preventDefault();
window.open(this.href,'share-this','height=300,width=500,status=no,toolbar=no');
}
});
+
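+ If you’d rather not depend on jQuery for this, the same idea can be sketched with plain DOM event delegation (a sketch only; wirePopups is a hypothetical helper, not code from this site, and the 700px threshold mirrors the jQuery version above):

```javascript
// Delegate clicks from the sharing container and open a popup window
// when the viewport is wide enough; otherwise let the link navigate.
function wirePopups(container, minWidth) {
  container.addEventListener('click', function (e) {
    var link = e.target.closest('a'); // requires Element.closest support
    if (link && window.innerWidth > minWidth) {
      e.preventDefault();
      window.open(link.href, 'share-this',
                  'height=300,width=500,status=no,toolbar=no');
    }
  });
}
// wirePopups(document.getElementById('bookmark'), 700);
```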

+ As you can see, with just a little bit of effort (and maybe a bit of research), it’s easy to protect your users. Please consider doing it on your own sites.

+

diff --git a/export/2012-05-02-funkas-tillgaenglighetsdagar-2012.md b/export/2012-05-02-funkas-tillgaenglighetsdagar-2012.md new file mode 100644 index 0000000..3ff79c2 --- /dev/null +++ b/export/2012-05-02-funkas-tillgaenglighetsdagar-2012.md @@ -0,0 +1,26 @@ +--- +title: "Funkas Tillgänglighetsdagar 2012" +date: 2012-05-02 22:17:00 +comments: false +tags: + - "(x)HTML" + - "accessibility" + - "coding" + - "conferences" + - "design" + - "JavaScript" + - "mobile" + - "presentations" + - "progressive enhancement" + - "usability" + - "web standards" +description: "A few weeks back, I flew to Sweden to deliver a talk on progressive enhancement for mobile devices at Funkas Tillgänglighetsdagar , an accessibility conference whose name I will probably always butcher. I really enjoyed getting to know..." +permalink: /archives/funkas-tillgaenglighetsdagar-2012/ +--- + +

+ A few weeks back, I flew to Sweden to deliver a talk on progressive enhancement for mobile devices at Funkas Tillgänglighetsdagar, an accessibility conference whose name I will probably always butcher. I really enjoyed getting to know the Funka Nu team, meeting new people, and seeing how countries like Sweden, Norway, and Germany are addressing issues of accessibility in both public and private spheres. It was also nice to see validation for some of the thinking and work we’ve done around issues of accessibility.

+

+ Anyway, I thought I’d share my slide deck from the talk in case you’re interested. It was picked up yesterday and today as “Top Presentation of the Day” on SlideShare, so it’s either really useful or a slow time for uploads. Regardless, enjoy!

+
+
diff --git a/export/2012-08-06-implementing-responsive-design.md b/export/2012-08-06-implementing-responsive-design.md new file mode 100644 index 0000000..0bc3347 --- /dev/null +++ b/export/2012-08-06-implementing-responsive-design.md @@ -0,0 +1,38 @@ +--- +title: "Implementing Responsive Design" +date: 2012-08-06 16:15:00 +comments: true +tags: + - "books & articles" + - "mobile" + - "progressive enhancement" +description: "In case you hadn’t heard, Tim Kadlec fantastic book Implementing Responsive Design came out today from New Riders. It’s a fantastic and necessary read for any practicing web professional out there and I was honored Tim asked me to write..." +permalink: /archives/implementing-responsive-design/ +--- + +

+ In case you hadn’t heard, Tim Kadlec’s book Implementing Responsive Design came out today from New Riders. It’s a fantastic and necessary read for any practicing web professional out there, and I was honored Tim asked me to write the foreword. With his permission, I have included it below:

+
+

+ A few years back, photography legend Chase Jarvis smartly observed that “the best camera is the one that’s with you.” It was a mildly shocking assertion at the time, but it rings true: the perfect shot is rarely planned. Rather, it sneaks up on you.

+

+ Perhaps the light is perfectly accentuating the fall foliage on your late afternoon stroll. Or perhaps your infant daughter just pulled herself up on two legs for the first time. In moments like these, it doesn’t matter that your Leica is sitting on a shelf in the other room or that you left your Rebel in the car—what matters is that you have a camera, however crude, in your pocket and can capture this serendipitous and ephemeral moment.

+

+ Riffing on Jarvis’s idea, Stephanie Rieger has made the case that the best browser is the one you have with you. After all, life is unpredictable. Opportunities are fleeting. Inspiration strikes fast and hard.

+

+ Imagine yourself as a cancer researcher. You’ve been poring over a mountain of research for months, looking for a way to increase interferon-gamma production in an effort to boost the body’s natural ability to inhibit the development of tumors. Your gut tells you that you’re close to an answer, but it’s just out of reach. Then one morning, while washing the exhaustion off in a nice hot shower, it hits you. Eureka! You think you’ve got it—you just need to refer back to that paper you read last week.

+

+ Dripping, you leap from the tub and land on the bath mat. Without even grabbing a towel, you pluck your mobile off the counter and head to the journal’s site, only to find yourself re-routed to a “lite” version of the website that shows you only general information about the publication and prompts you to subscribe.

+

+ Your fingers leave wet streaks across the screen as you frantically scroll down the page to find the inevitable link to “View Full Site” and click it. As the screen loads, you find yourself hovering 30,000 feet above a patchwork quilt of a homepage that could only have been designed by committee.

+

+ Several minutes of pinching, zooming, and typing later, you finally find the article, only to discover it’s a PDF and nearly impossible to read on your tiny screen. Dejected, you put down the phone and sulk back into the shower, hoping it will wash away your disappointment.

+

+ Sadly, browsing the web on mobile is all too often a frustrating (and occasionally dehumanizing) endeavor. But it doesn’t have to be.

+

+ In the pages of this very book, my friend Tim clearly outlines the steps you can (and indeed should) take to ensure that the sites you help create offer each user a fantastic experience, tailored to the capabilities of her device and respectful of her time, patience, and data limits. Don’t let his small town charm fool you: Tim knows this stuff inside and out. I learned a ton from this book and I know you will too.

+
+

+ Trust me when I say you need this book. Luckily, you can pick it up from Amazon, Peachpit, and Barnes & Noble.

+

+ Happy reading!

diff --git a/export/2012-10-30-slides-from-my-talk-at-how-interactive.md b/export/2012-10-30-slides-from-my-talk-at-how-interactive.md new file mode 100644 index 0000000..41ab5ff --- /dev/null +++ b/export/2012-10-30-slides-from-my-talk-at-how-interactive.md @@ -0,0 +1,19 @@ +--- +title: "Slides from my talk at HOW Interactive" +date: 2012-10-30 21:57:27 +comments: false +tags: + - "(x)HTML" + - "browsers" + - "CSS" + - "mobile" + - "presentations" + - "progressive enhancement" + - "web standards" +description: "These last two days have been a bit of a whirlwind, but I have had a great time meeting and talking to the attendees (and other speakers) here at the HOW Interactive conference in San Francisco . I gave my talk yesterday on progressive..." +permalink: /archives/slides-from-my-talk-at-how-interactive/ +--- + +

These last two days have been a bit of a whirlwind, but I have had a great time meeting and talking to the attendees (and other speakers) here at the HOW Interactive conference in San Francisco. I gave my talk yesterday on progressive enhancement (of course) and how it can make designing and developing for mobile a little more sane. Here are the slides (which you can also see and download on Slideshare) from that talk:

+
+

I also provided attendees with a reading list. It’s the one I developed for a private training a few weeks ago with Jeremy and one which I will continue to update as I find more useful resources I want to share.

diff --git a/export/2013-01-15-welcome-jeff-bridgforth.md b/export/2013-01-15-welcome-jeff-bridgforth.md new file mode 100644 index 0000000..a045d2e --- /dev/null +++ b/export/2013-01-15-welcome-jeff-bridgforth.md @@ -0,0 +1,15 @@ +--- +title: "Welcome Jeff Bridgforth" +date: 2013-01-15 12:00:56 +comments: true +tags: + - "business" + - "coding" +description: "Today, I have the honor of introducing you to the newest Easy team member: Jeff Bridgforth . Jeff comes to us from Bonnier, where he built websites for Popular Science , Popular Photography , Saveur , and Parenting . Being a former..." +permalink: /archives/welcome-jeff-bridgforth/ +--- + +

Today, I have the honor of introducing you to the newest Easy team member: Jeff Bridgforth. Jeff comes to us from Bonnier, where he built websites for Popular Science, Popular Photography, Saveur, and Parenting. Being a former publishing guy myself, I’m delighted to be bringing someone on who has a solid grounding in content-rich websites.

+

True to his Twitter handle, Jeff is a web craftsman of the highest order. He breezed through the rigorous hurdles of my technical interview, yet is humble about his skills and is always eager to learn more. Jeff has demonstrated a great tenacity in his professional life thus far and we are excited to see how he puts that to work for us.

+

Jeff and his family are currently living outside of Orlando, Florida, but we hope to have them relocated to the Scenic City by the summer.

+

Welcome Jeff!

diff --git a/export/2013-02-02-responsive-tables.md b/export/2013-02-02-responsive-tables.md new file mode 100644 index 0000000..5d9e766 --- /dev/null +++ b/export/2013-02-02-responsive-tables.md @@ -0,0 +1,28 @@ +--- +title: "Responsive Tables" +date: 2013-02-02 17:32:27 +comments: true +tags: + - "(x)HTML" + - "accessibility" + - "CSS" + - "mobile" + - "progressive enhancement" + - "usability" + - "web standards" +description: "A few smart folks have already put together their thoughts on responsive tables and, while I think the proposed methods are pretty good, I think there might be room for improvement. As such, I’ve been tinkering for a while and came up..." +permalink: /archives/responsive-tables/ +--- + +

A few smart folks have already put together their thoughts on responsive tables and, while I think the proposed methods are pretty good, I think there might be room for improvement. As such, I’ve been tinkering for a while and came up with the following strategy when it comes to tables.

+

Step 1: Use data-* attributes to hold information about the column header(s) associated with each cell:

+
<table>
<thead>
<tr>
<th scope="col">Name</th>
<th scope="col">Email</th>
<th scope="col">Dept, Title</th>
<th scope="col">Phone</th>
</tr>
</thead>
<tbody>
<tr class="vcard">
<th scope="row" class="n" data-title="Name">
<b class="family-name">Smith</b>,
<b class="given-name">Laura</b>
</th>
<td data-title="Email">
<a class="email" href="mailto:laura.smith@domain.com">laura.smith@domain.com</a>
</td>
<td data-title="Dept, Title">Biology, Director</td>
<td class="tel" data-title="Phone">
<a href="tel:+1123456789">123-456-789</a>
</td>
</tr>
<tr class="vcard">
<th scope="row" class="n" data-title="Name">
<b class="family-name">Johnson</b>,
<b class="given-name">Ron</b>
</th>
<td data-title="Email">
<a class="email" href="mailto:ron.johnson@domain.com">ron.johnson@domain.com</a>
</td>
<td data-title="Dept, Title">Purchasing, Director</td>
<td class="tel" data-title="Phone">
<a href="tel:+11234567891">123-456-7891</a>
</td>
</tr>
</tbody>
</table>
+

Step 2: When the screen is below a certain threshold, set the table elements to display: block (thereby linearizing the table), hide the thead where assistive tech won’t see it, and use generated content to expose the data-* attributes. Here’s a snippet of SASS & Compass that does that:

+
// undo tables for small screens
// $break-4 is the px-width break at which you want to cut it off
@media (max-width: px-to-ems($break-4 - 1px)) {
// make each table separate from other ones
table {
border: 0;
@include trailing-border;
padding-bottom: 0;
display: block;
width: 100%;
// make sure captions are displayed
caption {
display: block;
}
/*
* wipe the thead from the face of the earth
* modern screen readers will expose the
* generated content
*/
thead {
display: none;
visibility: hidden;
}
/*
* make everything display block so it
* aligns vertically
*/
tbody, tr, th, td {
border: 0;
display: block;
padding: 0;
text-align: left;
white-space: normal;
}
// give each row a little space
tr {
@include trailer;
}
/* Labeling
* adding a data-title attribute to the cells
* lets us add text before the content to provide
* the missing context
*
* Markup:
* <td data-title="Column Header">Content Here</td>
*
* Display:
* Column Header: Content Here
*/
th[data-title]:before,
td[data-title]:before {
content: attr(data-title) ":\00A0";
font-weight: bold;
}
th:not([data-title]) {
font-weight: bold;
}
// hide empty cells
td:empty {
display: none;
}
}
}
+

We’ve been using this approach on a number of sites currently in development and it works really well. I put together a demo of this technique so you could play around with it yourself.

+

Notes:

+
    +
  1. I chose to use a data-* attribute (data-title) instead of title because the title attribute could be read out by assistive technology; if the thead were also available (i.e. not display: none), the information could be read twice, which is not ideal. That’s not a certainty, however, so you could go the title route if that’s your preference. I prefer to avoid the potential issue.
  2. If you have multiple header rows over a cell (say, a parent row and then a child row), I’d recommend making the data-title something like “Parent Header - Child Header”.
  3. While you could use JavaScript to auto-generate the data-title attributes by referencing the column headers, I feel this is information that should exist even if JavaScript is not available. You may disagree.
+
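+ To illustrate the auto-generation mentioned in that last note, here’s a rough sketch in plain JavaScript (addDataTitles is a hypothetical helper assuming a single header row, not code from this site):

```javascript
// Copy each column header's text into a data-title attribute on the
// cells below it, so the generated content has something to expose.
// Assumes one header row whose cell count matches each body row.
function addDataTitles(table) {
  var headers = table.querySelectorAll('thead th');
  var rows = table.querySelectorAll('tbody tr');
  for (var r = 0; r < rows.length; r++) {
    var cells = rows[r].querySelectorAll('th, td');
    for (var c = 0; c < cells.length && c < headers.length; c++) {
      cells[c].setAttribute('data-title', headers[c].textContent);
    }
  }
}
```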
diff --git a/export/2013-03-04-presto-change-o.md b/export/2013-03-04-presto-change-o.md new file mode 100644 index 0000000..1eed642 --- /dev/null +++ b/export/2013-03-04-presto-change-o.md @@ -0,0 +1,81 @@ +--- +title: "Presto Change-o" +date: 2013-03-04 15:11:00 +comments: false +tags: [] +description: "As you’ve probably heard, Opera has announced that they are abandoning their Presto rendering engine in favor of Webkit . CTO Håkon Wium Lie (you know, one of the guys who invented CSS ) has stated that this will allow Opera’s resources..." +permalink: /archives/presto-change-o/ +--- + +

+ As you’ve probably heard, Opera has announced that they are abandoning their Presto rendering engine in favor of Webkit. CTO Håkon Wium Lie (you know, one of the guys who invented CSS) has stated that this will allow Opera’s resources to assist with the continued development and improvement of Webkit:

+
+

+It makes more sense to have our experts working with the open source communities to further improve WebKit and Chromium, rather than developing our own rendering engine further. Opera will contribute to the WebKit and Chromium projects, and we have already submitted our first set of patches: to improve multi-column layout.

+
+

+ I am hopeful that Opera’s eagerness to help improve Webkit does not peter out over time. After all, it’s tempting to take the Webkit core and just drop it into your own browser’s chrome, allowing your team to focus on the browser itself rather than the rendering engine. That is, after all, the dream from many authors’ and implementors’ standpoints: browsers compete on features, not standards support.

+

+ So, on the surface, this sounds great: Opera adopts Webkit. Chrome and Safari are already Webkit. Blackberry, Palm, Android, iOS, Symbian, and many more all use Webkit. That means fewer headaches for me, right? Well, not really. As PPK informed us years ago, there is no one Webkit (on mobile or desktop).

+

+ Webkit is what you want it to be

+

+ You see, as an open source project, Webkit is constantly evolving. That’s a good thing as it means newly-proposed features roll quickly into it. As open source software, however, it is also modifiable. That means not only will you have differences between nightly builds of Webkit as new features are added and bugs are fixed, but each browser built on it can choose to jettison certain features or add ones of their own.

+

+ Apple, creators of the Webkit engine (who based their own work on the KHTML rendering engine), for example, have implemented the HTML5 form validation API for JavaScript, but have yet to expose it in the UI. Chrome, on the other hand, implements both. Similarly, Apple also disabled the “file” input type on iOS versions prior to 6. From an implementor’s standpoint, it’s nice because they can pick and choose their build to be tailored to highlight the strengths (or mask the weaknesses) of their browser or OS, but there is a dark side too: a company can choose to opt out of specific standards in order to undermine the web as a platform.

+

+ One of the most egregious (in my opinion at least) examples of this is how Apple has treated multimedia on iOS. First, they made it impossible to cache audio & video files for offline use. You could make an argument that this keeps sites from bogging down a device with lots of files, but if you put a user in control of what’s cached & how much room it can take, that’s completely avoidable. But, when you consider that you can’t control the playback of media files solely via JavaScript (without requiring user interaction such as a tap), it becomes clear that Apple doesn’t really want competition from web-based apps. Want to make a web-based game that works offline on iOS and includes sound? Sorry!

+

+ Now Apple isn’t the only company to have tweaked their Webkit instances. HTC is pretty infamous for monkeying around with both Android in general and Webkit specifically, and they are by no means alone in that. Along the way, many implementors have augmented Webkit for one reason or another, and a few introduced serious bugs in the process. Here are just a few implementation issues I’ve come across, but there are hundreds more:

+ +

+ But the implementors are not the only ones introducing bugs.

+

+ Webkit itself suffers from numerous long-standing, serious bugs

+

+ Overall, Webkit is a pretty awesome rendering engine. It’s fast, lightweight, and does a damn good job laying out pages. But it’s not a panacea. It has its issues.

+

+ A brief sojourn through the Webkit bug tracker (and Chromium’s if you’re up for it) reveals a litany of well-documented, long-standing, serious usability, accessibility, and standards-implementation bugs that have not been touched. Here are a couple I’ve been tracking:

+ +

+ Now, as the number of companies using (and forking) the Webkit core has increased, the capacity to address these bugs has increased as well. Still, it seems bug fixes are always lagging behind new feature implementations. And then there’s always the folks who want to use a piece of open source software as-is, with little concern for making it better. Hopefully the addition of the incredibly smart developers from Opera to the mix (many of whom are accessibility experts) will bode well for issues like these to be remedied.

+

+ What does this mean for standards?

+

+ Webkit is pretty solid in its standards support. It has also been the testing ground for countless HTML5 and CSS3 proposals, many of which have gained traction. With Opera ditching Presto for Webkit, however, I’m a little concerned this may not bode well for the ratification of future standards.

+

+ You see, the web standards process requires that a Proposed Recommendation at the W3C have at least two interoperable implementations before it can become a Recommendation. With Opera moving to Webkit, we are left with one less potential implementor, as I don’t think you can consider the same Webkit implementation in Safari, Chrome, and Opera to be anything more than one aggregate implementation. If you counted it as three independent implementations, it would essentially grant Webkit license to deem pretty much anything they come up with a standard, making a mockery of the whole process. And so we are more reliant than ever on the Internet Explorer, Mozilla, and Webkit teams being on the same page (and timeline) when it comes to implementations, or the whole process will stall.

+

+ From what I gather—granted, it’s been a while since I sat through a W3C Working Group meeting, so my impression could be misguided—things at the W3C seem to be moving along much more smoothly than they have in the past, so perhaps my concern for the standards development process is unwarranted. I hope so, but only time will tell.

+

+ Further reading:

+ diff --git a/export/2013-04-16-orlando-in-a-whirlwind.md b/export/2013-04-16-orlando-in-a-whirlwind.md new file mode 100644 index 0000000..6bfea38 --- /dev/null +++ b/export/2013-04-16-orlando-in-a-whirlwind.md @@ -0,0 +1,23 @@ +--- +title: "Orlando in a Whirlwind" +date: 2013-04-16 15:51:00 +comments: true +tags: + - "conferences" + - "design" + - "mobile" + - "presentations" +description: "Last week was a bit of a whirlwind: Kelly and I flew to Orlando, co-hosted a vegan chili cook-off with the Filament Group and lost the coveted trophy Kelly so lovingly created, launched a refresh of the website for the Registrar’s..." +permalink: /archives/orlando-in-a-whirlwind/ +--- + +

+ Last week was a bit of a whirlwind: Kelly and I flew to Orlando, co-hosted a vegan chili cook-off with the Filament Group and lost the coveted trophy Kelly so lovingly created, launched a refresh of the website for the Registrar’s Office at Sewanee University, and I delivered a new talk and workshop at Breaking Development.

+

+ To be honest, I was a little nervous about my talk, Designing with Empathy. I am a developer. I live and breathe code and this was my first attempt at delivering a completely code-free talk. And one on a fairly touchy-feely subject to boot. To my amazement, the talk seemed to resonate with the audience. I received lots of excellent questions and had a handful of in-depth conversations with attendees after stepping off the dais. I could not be more pleased with the talk’s reception and am looking forward to delivering it a few more times this year at Beyond Tellerrand in Düsseldorf, Germany next month and Reasons to Be Creative in Brighton, UK in September. You can check out my deck on Slideshare (or thumb through it below). Luke W also took some excellent notes during my session. The video is forthcoming.

+
+
+

+ I also debuted a new workshop in Orlando: Planning Adaptive Interfaces. The idea was borne out of the corporate training work I’ve been doing and Retreats 4 Geeks’ mentoring sessions: A brief introduction to progressive enhancement and adaptive considerations followed by actual hands-on group activities where teams plan and sketch out different ways to experience common website conventions, taking into account screen real estate, browser capabilities, assistive technology and more. The response to the workshop was excellent as well and I appreciated the opportunity to get into the trenches with the teams and help them tackle complex cross-device problems. I’ll be taking this workshop on the road to UXLx in Lisbon, Portugal and Beyond Tellerrand in Düsseldorf, Germany next month and a few other as-yet-unannounced conferences in Europe this Fall.

+
+
Small groups planning adaptive interfaces
diff --git a/export/2013-07-07-apple-vs-the-open-web.md b/export/2013-07-07-apple-vs-the-open-web.md new file mode 100644 index 0000000..727fed3 --- /dev/null +++ b/export/2013-07-07-apple-vs-the-open-web.md @@ -0,0 +1,48 @@ +--- +title: "Apple vs. the Open Web" +date: 2013-07-07 22:39:00 +comments: true +tags: + - "business" + - "iOS" + - "usability" +description: "I’ll admit it: I never really got Siri." +permalink: /archives/apple-vs-the-open-web/ +--- + +

+ I’ll admit it: I never really got Siri.

+

+ To me, she’s always been a bit gimmicky. When she debuted on the iPhone 4S, I thought the voice recognition stuff was neat, but I didn’t see her as being anything close to the “digital assistant” Apple promised us. The idea was good, but the implementation was about as inspiring as my then 5-year-old Garmin. Oh, but she couldn’t give you turn-by-turn directions.

+

+ Sure, Siri’s gotten better, but not much.

+

+ Now, after reading Dan Kaplan’s excellent TechCrunch post lamenting the Siri that could have been, I realize how much better she could—nay should—be. You see, prior to being bought by Apple, Siri Assistant was pretty damn useful. She was a true digital assistant, capable of setting up a whole evening of fun for you by purchasing movie tickets, getting you dinner reservations, and even hailing you a cab. Pre buy-out, her creators even had plans to supercharge Siri by giving her predictive awareness (think Google Now). Dan offered a few examples of how this might work:

+
+ +
+

+ Given Siri’s previous capabilities and the plans her creators had, how did she become so lame?

+

+ Personally, I think the reason is simple: Apple doesn’t get the web. Sure there are a lot of incredibly smart and talented people who work at Apple who clearly do understand what the web is and how it works, but I think as a company Apple doesn’t. Or worse it does, but they can’t control it or monetize it, so they’re not interested.

+

+ It’s a feeling I’ve had for quite some time, but reading this piece (especially in light of Jeremy Keith’s fantastic post about the movement of many web companies toward creating more walled gardens) really convinced me. I mean take a look at Maps.

+

+ Prior to iOS 6, Google Maps was the de facto mapping and directions app. It offered your standard driving and walking directions, but it also offered public transit directions based on public data (much of which Google has coalesced into Google Maps Transit). Now, with Apple’s homegrown Maps, if you want to get transit directions, you need to download a separate app from the App Store. Travel a lot? Try 8 different apps. Or 10.

+

+ Instead of using existing APIs to make transit directions native to Maps, they opted to fragment the experience.*

+
+
A quick direction search using Maps and the resulting screen when I route via public transit.
+

+ Sure, you might argue that the Maps team may have had to cut the transit feature due to time or budget constraints, but they found the time to make the maps three-dimensional. Just sayin’.

+

+ Clearly Apple could have used any of the publicly available transit APIs to accomplish this task, but they didn’t. The same goes for Siri. There are a ton of freely-available resources out there to collect information and then do something useful with it—to truly allow Siri to become the digital assistant of our dreams—but Apple doesn’t seem to have any interest. And I think their software is suffering for it.

+

+ * I’m all for de-coupling functionality in order to scale application logic, but de-coupling (a.k.a. fragmenting) the user experience is downright baffling. Especially for a company that prides itself on both design and user experience.

diff --git a/export/2013-07-10-evernote-for-interface-inventories.md b/export/2013-07-10-evernote-for-interface-inventories.md new file mode 100644 index 0000000..707158f --- /dev/null +++ b/export/2013-07-10-evernote-for-interface-inventories.md @@ -0,0 +1,54 @@ +--- +title: "Evernote for Interface Inventories" +date: 2013-07-10 12:00:00 +comments: true +tags: + - "design" +description: "Earlier today, Brad Frost posted a great piece touting the usefulness of interface inventories . I’ll give him the floor to explain:" +permalink: /archives/evernote-for-interface-inventories/ +--- + +

+ Earlier today, Brad Frost posted a great piece touting the usefulness of interface inventories. I’ll give him the floor to explain:

+
+

+ An interface inventory is similar to a content inventory, only instead of sifting through and categorizing content, you’re taking stock and categorizing the components making up your website, app, intranet, hoobadyboop, or whatever (it doesn’t matter). An interface inventory is a comprehensive collection of the bits and pieces that make up your interface.

+
+

+ Interface inventories are a great way to take stock of the design consistency (or inconsistency) of your site and are a typical first step in creating a pattern library. After all, you need to know what patterns you have before you can document them.

+

+ In his article, Brad offers a Keynote template for gathering your screenshots, but I have been living in Evernote lately, so I wanted to take a moment to show off how you might use Evernote’s tools to simplify the process of building an interface inventory. If you don’t have an Evernote account, you can get a free one here.

+

+ One method of getting screenshots into Evernote is using Skitch. Skitch was originally developed by plasq, but was acquired by Evernote in 2011. It is a general purpose screenshotting tool that supports annotations, etc.

+
+
+

+ Here’s a quick run-down on how to use Skitch to build your interface inventory:

+

+Step 1: Create a new Notebook for your interface inventory and adjust the Skitch preferences so it uses it.

+
+
+

+Step 2: Click the “Screen Snap” button and adjust the crop tool to contain the interface object you want to capture.

+
+
Position your crosshairs and…
+
+
…your snap is captured in Skitch
+

+Step 3: Rinse & repeat.

+

+ When Skitch syncs up to Evernote, your screenshots will magically appear on any device you have.

+

+ Another route to go involves the Evernote Web Clipper. This add-on is available for pretty much every browser and is available as a bookmarklet to boot.

+

+ Using the Web Clipper is every bit as simple as it is with Skitch. Possibly more so. One bonus is that you can direct your individual clips to different notebooks within Evernote. Simply click the Web Clipper button, select the area to clip, and choose where you want the clip to go. You can even add any tags you might find useful.

+
+
+

+ Either way you go, you will eventually end up with a nice little collection of interface artifacts in Evernote which are, in turn, available on the web and on any device where you have Evernote installed.

+
+
All your interface are belong to us.
+

+ The nice thing about using a tool like Evernote for creating an interface inventory is that you can share this notebook with your colleagues to speed up the documentation. Simply divide up your interface into categories and go on a scavenger hunt. All of the snaps will sync to the same place and become a part of your interface inventory. Done and done.

+

+ Happy snapping!

diff --git a/export/2013-07-15-designing-with-empathy-at-btconf.md b/export/2013-07-15-designing-with-empathy-at-btconf.md new file mode 100644 index 0000000..a2757c7 --- /dev/null +++ b/export/2013-07-15-designing-with-empathy-at-btconf.md @@ -0,0 +1,29 @@ +--- +title: "Designing with Empathy at #btconf" +date: 2013-07-15 06:11:00 +comments: true +tags: + - "coding" + - "conferences" + - "design" + - "JavaScript" + - "mobile" + - "optimization & performance" + - "presentations" + - "progressive enhancement" + - "usability" + - "web standards" +description: "A little over a month ago I had the pleasure of speaking at Beyond Tellerrand in Düsseldorf, Germany. It was my second time speaking (and attending) the conference and I can honestly say it’s easily one of my favorites. Marc Thiele does..." +permalink: /archives/designing-with-empathy-at-btconf/ +--- + +

+ A little over a month ago I had the pleasure of speaking at Beyond Tellerrand in Düsseldorf, Germany. It was my second time speaking (and attending) the conference and I can honestly say it’s easily one of my favorites. Marc Thiele does an amazing job organizing the event and the speaker roster was nothing short of amazing.

+

+ In an effort to continue spreading my wings beyond talking about code, I delivered a talk about empathy. Empathy is something I’ve written about here before both explicitly and as an underlying motivation for progressive enhancement and overall usability. Empathy is something I feel we need desperately in our lives and especially in our work—empathy for both our users and our co-workers.

+

+ Anyway, Marc was kind enough to record the talk. Let me know what you think.

+
+
Video of my talk “Designing with Empathy” from Beyond Tellerrand
+
+
Slides from “Designing with Empathy” as delivered at Beyond Tellerrand
diff --git a/export/2013-07-22-jail-ing-images-in-expressionengine.md b/export/2013-07-22-jail-ing-images-in-expressionengine.md new file mode 100644 index 0000000..3cad347 --- /dev/null +++ b/export/2013-07-22-jail-ing-images-in-expressionengine.md @@ -0,0 +1,47 @@ +--- +title: "JAIL-ing images in ExpressionEngine" +date: 2013-07-22 18:04:00 +comments: true +tags: + - "(x)HTML" + - "content management" + - "ExpressionEngine" + - "JavaScript" + - "optimization & performance" +description: "A while back I came across a link to Sebastiano Armeli-Battana’s jQuery Asynchronous Image Loader ( JAIL ) and filed it away to revisit when I had some time. I finally made some time this weekend." +permalink: /archives/jail-ing-images-in-expressionengine/ +--- + +

+ A while back I came across a link to Sebastiano Armeli-Battana’s jQuery Asynchronous Image Loader (JAIL) and filed it away to revisit when I had some time. I finally made some time this weekend.

+

+JAIL’s a cool little script that takes care of lazy loading images for you in order to speed up initial page rendering. To use it, you implement the following markup pattern:

+

+

<img class="jail" src="blank.gif" data-src="foo.png" alt=""/>
<noscript>
<img src="foo.png" alt=""/>
</noscript>
+

+ This is pretty ingenious actually. Without JS, the actual image is served up, but with JavaScript, the blank image is displayed until JAIL lazy loads the real image path stored in the data-src attribute. You initialize JAIL like this:

+

+

$(function(){
$('img.jail').jail();
});
+

+ Simple, right? Well, yes and no.

+

+ On a site that isn’t updated frequently and where a skilled front-end coder is involved, this is cake. That, however, is seldom the reality. Heck, even on this blog, remembering to use that pattern while authoring content in the backend is not terribly likely. I needed a way to make it easier. So I automated JAIL as an ExpressionEngine plug-in.

+

+ With this plug-in I can automatically enable (or remove) JAIL at the template level with a simple tag pair: exp:easy_jail:prep

+

+

{exp:easy_jail:prep}
{body}
{/exp:easy_jail:prep}
+

+ The plugin will hunt for any image elements inside the tag pair and convert them to use the JAIL markup pattern. By default, it uses a base-64 encoded representation of a blank GIF (a.k.a. a Data URI) to reduce the number of requests, but you can override that with the path to your own image (or a different Data URI) using the blank_img property. You can also customize the class used for the blank image using the class_name property.

+

+

{exp:easy_jail:prep blank_img="/i/blank.gif" class_name="my_class"}
{body}
{/exp:easy_jail:prep}
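Under the hood, the prep tag is essentially string surgery on each image. Here is a rough JavaScript sketch of the transformation (the actual plug-in is written in PHP for ExpressionEngine, so this is illustrative only; the Data URI below is the common 1×1 transparent GIF):

```javascript
// A rough JavaScript sketch of what exp:easy_jail:prep does to each
// image. Illustrative only: the real plug-in is PHP.
var BLANK_GIF = 'data:image/gif;base64,R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7';

function jailImage(imgTag, blankImg, className) {
  blankImg = blankImg || BLANK_GIF;
  className = className || 'jail';
  // Move the real src into data-src and swap in the blank image...
  var src = imgTag.match(/src="([^"]*)"/)[1];
  var jailed = imgTag
    .replace(/src="[^"]*"/, 'src="' + blankImg + '" data-src="' + src + '"')
    .replace('<img ', '<img class="' + className + '" ');
  // ...and keep a <noscript> copy of the original image as the fallback.
  return jailed + '<noscript>' + imgTag + '</noscript>';
}
```

Feed it a plain image tag and you get back the JAIL markup pattern shown earlier, `noscript` fallback included.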
+

+ Then it’s just a matter of including the JAIL JavaScript code and executing it. You can, of course, include Sebastiano’s script and your JAIL config in your own JavaScript build, but the plug-in also includes a convenience function to drop it in for you. Simply add the exp:easy_jail:js tag before the close of the body element (after jQuery of course):

+

+

<script src="//ajax.googleapis.com/ajax/libs/jquery/1.10.1/jquery.min.js"></script>
<script>window.jQuery || document.write('<script src="/j/jquery.js"><\/script>')</script>
{exp:easy_jail:js}
+

+ As with the exp:easy_jail:prep tag, you can customize the JavaScript output using the class_name property to tell JAIL what to lazy load. You can also customize the JAIL configuration using the config property. Just pass in a valid JSON object describing the configuration you want. JAIL is pretty darn configurable. There are a ton of options available, but the most intriguing to me currently is offset. We’re using it here on the blog to load images when you scroll to within 300px of the top of the image. Here’s how you’d do that using the plug-in:

+

+

<script src="//ajax.googleapis.com/ajax/libs/jquery/1.10.1/jquery.min.js"></script>
<script>window.jQuery || document.write('<script src="/j/jquery.js"><\/script>')</script>
{exp:easy_jail:js config="{offset:300}"}
+

+ And there you have it. Simple, lazy-loaded images without having to train content editors to author the relatively complex markup pattern. If you want to have a play, feel free to grab the code from GitHub or you can fork it and help us to make it even more useful.

+

diff --git a/export/2013-08-01-the-true-cost-of-progressive-enhancement.md b/export/2013-08-01-the-true-cost-of-progressive-enhancement.md new file mode 100644 index 0000000..b2c9507 --- /dev/null +++ b/export/2013-08-01-the-true-cost-of-progressive-enhancement.md @@ -0,0 +1,84 @@ +--- +title: "The True Cost of Progressive Enhancement" +date: 2013-08-01 12:35:00 +comments: true +tags: + - "accessibility" + - "business" + - "client relations" + - "coding" + - "CSS" + - "JavaScript" + - "mobile" + - "progressive enhancement" + - "web standards" +description: "When you’ve been evangelizing progressive enhancement for as long as we have, you invariably come across skeptics. Take this comment on Tim Kadlec’s recent (and well-argued) post about designing experiences that work without JavaScript :" +permalink: /archives/the-true-cost-of-progressive-enhancement/ +--- + +

+ When you’ve been evangelizing progressive enhancement for as long as we have, you invariably come across skeptics. Take this comment on Tim Kadlec’s recent (and well-argued) post about designing experiences that work without JavaScript:

+
+

+ This is all fine and dandy, but not very real world. A cost-benefit analysis has to happen – what does that next user/visitor cost, and more importantly earn you? This idealistic approach would leave most broke if they had to consider “every user” when building a site. That's why clothes come in small, medium, large, and extra large. Most of us have to buy them that way because not everyone can afford a tailor made suit, much less an entire wardrobe. Your approach only works for those who can see the return.

+
+

+ Tim’s response was dead-on:

+
+

+ I think that's where the difference between “support” and “optimization” comes into play. I'm certainly not saying to go out and buy every device under the sun, test on them, make sure things look and behave the same. You don't necessarily have to optimize for all these different devices and scenarios (that's where the cost-benefit analysis has to come in), but it's often not very time consuming to at least support them on some level.

+

+ Progressive enhancement can get you a long way towards accomplishing that goal. Sometimes it's as simple as doing something like “cutting the mustard” to exclude older devices and browsers that might choke on advanced JS from having to try and deal with that. The experience isn't the same, but if you've used progressive enhancement to make sure the markup is solid and not reliant on the JavaScript, it's at least something that is usable for them.

+
+
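For those unfamiliar with it, “cutting the mustard” is just a quick JavaScript capability check: test for a handful of modern APIs and only load the enhanced experience when they all exist. A minimal sketch (the feature list below is an illustrative assumption; every team picks its own):

```javascript
// "Cut the mustard": gate the enhanced experience behind a quick
// capability check. The three features tested here are an
// illustrative assumption, not a definitive list.
function cutsTheMustard(win, doc) {
  return 'querySelector' in doc &&
         'localStorage' in win &&
         'addEventListener' in win;
}

// In a real page you would pass in the globals, e.g.:
//   if (cutsTheMustard(window, document)) { /* load the advanced JS */ }
```

Browsers that fail the test simply get the solid, markup-driven baseline Tim describes; nothing breaks, they just aren’t enhanced.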

+ I’ve had similar conversations innumerable times in person, on conference calls, in blog comments, and (of course) on Twitter. Sometimes I can win the skeptics over with a well-reasoned philosophical argument, but often I need to start filling in numbers.

+

+ Each project is different, so I’m often reluctant to say “progressive enhancement costs X.” It’s also part-and-parcel of everything we do here at Easy, so it’s damned near impossible to say what a project would cost without progressive enhancement. That said, we’ve been doing this long enough to have a few stories worth sharing. Here are two anecdotes from real projects we’ve worked on.

+

+ Backing Off From the Bleeding Edge

+

+ Some time ago we built a Chrome app for WikiHow. As a Chrome app and a show-piece for the new app store, our client wanted it to have fancy CSS3 animations & transitions, web fonts, a WebDB “back-end”, offline support, and lots of other HTML5-y bells and whistles. And, as our target was a single browser, we relented when asked to go the single-page app route. The app was built to degrade gracefully (it blocked non-WebKit browsers), but it was not progressively enhanced.

+

+ Skip ahead about a year and our client returned to add support for Firefox and IE9+. Oh boy.

+

+ Having built the site purely for WebKit, we faced a bit of a challenge. In addition to implementation differences with the experimental CSS features, we also had to deal with the DOM and JavaScript API variance among the browsers. But the single biggest issue we ran into was the lack of WebDB support in Firefox and IE. You see, in the intervening year, WebDB had been abandoned at the W3C because of pushback (primarily from Mozilla and Microsoft). It was not available in either Firefox or IE, nor would it ever be. And indexedDB, its replacement, had yet to be implemented in any production browser. So we ended up writing a wrapper on top of localStorage that looked a lot like SQL, which allowed us to avoid re-writing the bulk of the app. Coincidentally, it also made the app a lot faster.
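To give a flavor of that approach, here is a minimal, hypothetical sketch of a SQL-ish facade over localStorage (not the actual project code): each “table” is a JSON-serialized array stored under a single key.

```javascript
// A minimal, hypothetical sketch of a SQL-ish facade over localStorage.
// Each "table" is a JSON-serialized array stored under one storage key.
function SimpleStore(storage) {
  this.storage = storage; // e.g. window.localStorage
}
SimpleStore.prototype.insert = function (table, row) {
  var rows = JSON.parse(this.storage.getItem(table) || '[]');
  rows.push(row);
  this.storage.setItem(table, JSON.stringify(rows));
};
SimpleStore.prototype.select = function (table, where) {
  var rows = JSON.parse(this.storage.getItem(table) || '[]');
  return where ? rows.filter(where) : rows;
};
```

Because reads and writes are synchronous, a facade like this sidesteps WebDB’s async transaction plumbing entirely, which is likely part of why the real wrapper turned out faster.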

+

+ The total cost of the new compatibility project was around 40% of the original budget required to build the app the first time around. Without access to an alternate timeline I can’t be certain, but my experience tells me it would have added less than 40% to the original project had we been given the leeway to build it using progressive enhancement. And the end result would have been even better because it would have been able to function without JavaScript.

+

+ Based on other conversations I’ve had with folks, the 40% number seems pretty accurate; possibly even a bit low. I remember a conversation I had six or seven years ago about Google Maps. When the team originally built Maps—in all of its Ajax-y glory—they didn’t make it very accessible and it required JavaScript. According to the source (who I have long forgotten), it took them almost twice as long to retrofit Maps than it would have taken had they built it following progressive enhancement from the ground up. As it’s purely anecdotal, you should take that with a grain of salt, but it’s food for thought.

+

+ Switching gears, let me share a success story around building things the right way.

+

+ Smart Code, Dumb Phones

+

+ In early 2012 we began working with a client who was struggling with the security of their mobile apps. They had numerous native apps that all followed the common convention of using a web service to authenticate users. They are a very security-conscious organization and this setup was creating a bottleneck in deploying new security features. In order to roll out any new authentication method, error state, etc., they had to go through an excruciatingly long, painful, multi-step process:

+
+
+ 1. Implement the new security feature,
+ 2. Expose it via the web service,
+ 3. Update each app to use the new web service (which might include UI changes, etc.),
+ 4. Submit each app for approval, and finally
+ 5. Hope their users downloaded the new version of the app.
+

+ They brought us in to re-imagine the authentication flow as a web-based process that would launch inside of each app and handle letting the app know if and when the user had successfully logged in. This approach meant they could roll out new security features immediately because the apps and the authentication flow would be very loosely coupled. It would be a huge win for everyone involved.

+

+ Despite the fact that the project was aimed at handling authentication for mobile apps on a few particular platforms, we built the screens following progressive enhancement. The layouts were responsive from tiny screens all the way up to large ones and we implemented HTML5 and JavaScript in completely unobtrusive ways in order to take advantage of cool new things like form validation while still keeping the file sizes small and ensuring the pages would function in the absence of either technology.

+

+ A few months after completing the project, our client came back to us with interest in rolling out the authentication flow to their m-dot users. They gave us a list of nearly 1400 unique User Agent strings that had been used on the login screen over a two-day period and asked if we could handle it. We parsed the list (with the help of a little script I cooked up) and were able to put together a more manageable list of aggregate devices and device types to use in our testing. It was something like 25 devices that would cover roughly 97% of the spectrum. We were comfortable assuming that fixing issues in 97% of the devices listed would likely also cover the other 3%, but were prepared to fix any additional issues if they cropped up.
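That script isn’t included here, but the general approach is easy to sketch (the function name and the version-collapsing heuristic below are assumptions):

```javascript
// A hypothetical sketch of collapsing raw User Agent strings into
// "families" so a ~1400-item list becomes a testable handful.
function aggregateUserAgents(uaStrings) {
  var families = {};
  uaStrings.forEach(function (ua) {
    // Collapse runs of digits/dots so differing version numbers
    // (and model revisions) fall into the same bucket.
    var family = ua.replace(/[0-9][0-9.]*/g, '#');
    families[family] = (families[family] || 0) + 1;
  });
  // Most common families first, so testing effort follows real usage.
  return Object.keys(families)
    .map(function (f) { return { family: f, count: families[f] }; })
    .sort(function (a, b) { return b.count - a.count; });
}
```

Sorting by frequency is what lets you say “these 25 devices cover roughly 97% of the traffic” and test accordingly.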

+

+ Our budget for this project was about 33% of the budget of the original project.

+

+ Much to our surprise, when all was said and done, we came in at roughly half of that budget in terms of actual hours spent and we completed the project in about half the time we expected. It was awesome for us because we saved our client money, which made us look good. It was awesome for our client too, because they were able to save serious money on a project (which rarely happens in the corporate world, at least in my experience).

+

+ It’s worth noting that this accomplishment had nothing to do with our bug-squashing prowess or our speed… progressive enhancement just works. We were dealing with some heinous old browsers too—think Blackberry 4 and OpenWave—and they really didn’t present much of a challenge. So, for a very modest sum, we were able to quickly roll out additional support for over 1000 devices (and probably thousands more that didn’t make the list) and that created a huge opportunity for our client to attract and retain new customers.

+

+ Lessons Learned

+

+ We’ve been practicing the art of progressive enhancement for a long time. It’s deeply-ingrained in our process and part of who we are as a company. That often makes it difficult for us to put hard numbers against the cost of not doing progressive enhancement and the financial savings of doing things the way we almost always do. Hopefully, these two small case studies help illuminate things a bit for those who may still be a bit skeptical. 

+

+ Do you have any case studies or anecdotes you can share? We'd love to hear them.

diff --git a/export/2013-08-12-over-90-of-newspaper-reading-is-in-print.md b/export/2013-08-12-over-90-of-newspaper-reading-is-in-print.md new file mode 100644 index 0000000..1cf0cb4 --- /dev/null +++ b/export/2013-08-12-over-90-of-newspaper-reading-is-in-print.md @@ -0,0 +1,40 @@ +--- +title: "Study: Over 90% of Newspaper Reading is in Print" +date: 2013-08-12 15:45:00 +comments: true +tags: + - "accessibility" + - "business" + - "culture & society" +description: "A recent study in of the UK came to the conclusion that over 90% of newspaper reading is still taking place in print. Their findings are based on a survey of 12 UK newspapers during the period of 2007–2011, examining National Readership..." +permalink: /archives/over-90-of-newspaper-reading-is-in-print/ +--- + +

+ A recent study out of the UK came to the conclusion that over 90% of newspaper reading is still taking place in print. Their findings are based on a survey of 12 UK newspapers during the period of 2007–2011, examining National Readership Survey data, circulation audits from the Audit Bureau of Circulations, and Nielsen data regarding web-based engagement.

+

+ In reviewing their domestic readership, comparing time spent reading online versus time spent reading print editions, the study found that 96.7% of reading time was spent with the print edition. Of course the “quality” of said publications varied greatly and that sad figure was even sadder for some online editions: Readers of “tabloid” newspapers spent, on average, a depressing 1.16% of their time reading the paper online. On the flip side, proper news outlets that are not behind a paywall saw 6.98% of their readership online. Paywalled online editions were all over the place: 4.1% for the Financial Times and only 0.83% for the Times.

+

+ I think the most interesting stat, however, was that the overall reading of some of these publications actually declined over the study period. In fact, the total time people spent reading the Independent went down 30.88% between 2007 and 2011.

+

+ Due to limitations of the data from the Audit Bureau of Circulations, the study was not able to include circulation data via apps and the meager data they could get was mostly self-reported and had to do mainly with page requests. They could not get data on reading time spent with the various newspapers’ apps.

+

+ Now granted, the data they used for the study is two years old at this point and some of the newspapers have redesigned their websites since this time, but the study got me wondering:

+ +

+ Having come from a journalism background, I am incredibly interested in seeing where things go. I have mixed feelings about print versus digital. On one hand, I have not subscribed to a newspaper for as long as I can remember. I only read them occasionally while traveling; most of my reading takes place digitally (online or at least via online sources). On the other hand, I do see print editions as being some people’s only access to what is going on around them.

+

+ It will be interesting to see how this all shakes out.

+

+Update: I made a small tweak to the declining readership paragraph per Neil’s correction in the comments.

diff --git a/export/2013-09-16-zoom-layouts-v2.md b/export/2013-09-16-zoom-layouts-v2.md new file mode 100644 index 0000000..d0eb895 --- /dev/null +++ b/export/2013-09-16-zoom-layouts-v2.md @@ -0,0 +1,49 @@ +--- +title: "Zoom Layouts v2" +date: 2013-09-16 15:56:00 +comments: true +tags: + - "accessibility" + - "coding" + - "CSS" + - "design" + - "mobile" + - "web standards" +description: "Some of you might find it hard to believe, but I began working with adaptive layouts way back in 2005. I was working on project for the Connecticut Department of Transportation and my primary design made heavy use of fixed positioning..." +permalink: /archives/zoom-layouts-v2/ +--- + +

+ Some of you might find it hard to believe, but I began working with adaptive layouts way back in 2005. I was working on a project for the Connecticut Department of Transportation and my primary design made heavy use of fixed positioning and white space. Sadly, this is the only screenshot I have of the now-defunct project:

+
+
+

+ The layout really started to break down on smaller screens—we had quite a few 800x600 monitors to deal with back in the day—so, inspired by Joe Clark’s A List Apart article “Big, Stark & Chunky,” I created an alternate stylesheet that rearranged the page layout, enlarged the text, and improved the reading experience. Sadly, I don’t have a screenshot of what that looked like, but here’s a decent approximation (sans background images), courtesy of the Wayback Machine:

+
+
+

+ We didn’t have media queries in those days, so I relied on JavaScript to do the stylesheet switching. It was pretty good work for the time, but I see a ton of things I would do differently if I had the opportunity to revisit it.

+

+ So why am I bringing this up? Well, I remembered Joe’s article the other day and was thinking about how relevant it is in this, the age of responsive design. I think the idea of high-contrast zoom layouts is incredibly useful, but not just for mobile. When you start to think about the other end of the spectrum—giant high-definition televisions sitting 8-10 feet from your face—zoom layouts become really useful again.

+

+ To that end, I have been thinking quite a bit about the viewport-based units available to us in modern browsers and how we can use them to create automated zoom layouts by simply increasing the font size of the body element. Consider this bit of code:

+

+@media screen and (min-width: 64em) {
+  body {
+    font-size: 1.5625vw;
+  }
+}
+
+

+ This tiny bit of CSS can ensure that the entire layout is proportionately scaled up based on the screen size being used to access it. To figure out how this bit of code would fit best into your own work, use this formula (replace “X” with your max width size in ems):

+

+@media screen and (min-width: Xem) {
+  body {
+    font-size: (100/X)vw; /* e.g. X = 64, so 100/64 = 1.5625vw */
+  }
+}
+
+

+ The site I developed this technique for is not live yet, so I threw together a simple demo on Codepen. Note: Chrome currently requires a forced repaint on window resize to make it shrink or enlarge the layout. Hopefully that bug will be fixed soon.
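If you want to work around the Chrome bug yourself, the usual trick is to force a reflow when the window resizes. A sketch of that workaround (an assumption on my part; it isn’t necessarily what the demo does):

```javascript
// Force the browser to recompute vw-based font sizes by triggering a
// reflow: hide the element, read a layout property to flush layout,
// then show it again. A workaround sketch for the Chrome bug above.
function forceRepaint(el) {
  el.style.display = 'none';
  void el.offsetHeight; // reading offsetHeight flushes pending layout
  el.style.display = '';
}

// In the page (commented out so this sketch stays self-contained):
//   window.addEventListener('resize', function () {
//     forceRepaint(document.body);
//   });
```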

+

+ I’m still feeling my way around this technique, but I am intrigued by the possibilities it holds. What do you think?

diff --git a/export/2014-01-21-a-web-for-everyone.md b/export/2014-01-21-a-web-for-everyone.md new file mode 100644 index 0000000..4e144d9 --- /dev/null +++ b/export/2014-01-21-a-web-for-everyone.md @@ -0,0 +1,39 @@ +--- +title: "A Web For Everyone" +date: 2014-01-21 10:31:00 +comments: true +tags: + - "accessibility" + - "books & articles" + - "culture & society" + - "design" + - "web standards" +description: "I was an only child, so it shouldn't come as a surprise that I grew up thinking the world revolved around me. In fact, I'll be the first to admit that I was a pretty selfish kid. Well behaved, certainly, but not terribly concerned with..." +permalink: /archives/a-web-for-everyone/ +--- + +

+ I was an only child, so it shouldn't come as a surprise that I grew up thinking the world revolved around me. In fact, I'll be the first to admit that I was a pretty selfish kid. Well behaved, certainly, but not terribly concerned with how my actions affected others.

+

+ As an only child, the Golden Rule my grandparents insisted was so important—Do unto others as you would have them do unto you—didn't really resonate. But I was a kid, what did I know? I was sheltered. I was young. I was the sole beneficiary of my parents' love, time, and money. I had a pretty good life, but I lacked perspective.

+

+ I like to think I've grown immensely in the intervening years. Through my work, travel, and moving around a lot, I've experienced dozens of cultures, and I've met hundreds of new people, each with their own life experiences, needs, and desires. Exposure to their unique perspectives has broadened my own and helped me break down the psychological barriers I maintained between me and the “others.”

+

+ But it wasn't until I started working on the web that I came to a full understanding of the importance of the Golden Rule. Prior to becoming a developer, the ramifications of my decisions were fairly limited. But on the web, every decision I make can have a profound effect on hundreds of thousands (if not millions) of people's lives. I can make checking into a flight a breeze, or I can make it a living hell.

+

+ That's a lot of power. And to quote Stan Lee: “With great power comes great responsibility.”

+

+ My mom always told me that if you choose to do something, you should do it well, so I made it my mission to make the web an easy-to-use, friendly, and accessible place. I chose to make the Golden Rule central to my work.

+

+ As schmaltzy and self-aggrandizing as all that may sound, it's also pretty shrewd. The Golden Rule can do wonders for your business. After all, what is good customer service if not treating someone like a human being worthy of your time, consideration, and respect? If we spend every day looking for new ways to make our customers' lives better, we'll create a lasting legacy and build a strong base of customer advocates along the way.

+

+ A commitment to universal accessibility is the highest form of customer service. It recognizes that we all have one special need or another at one time or another in our lives, and that fact should not preclude us from experiencing all the web has to offer. It's about providing everyone with equal opportunity to engage with your brand experience, even though they may do so in different ways. It breaks down the barriers between “us” and “them” and recognizes the humanity in our customers.

+

+ And it's really not that hard.

+

+ In the pages that follow, Whitney and Sarah beautifully lay out the case for accessibility, show you what it looks like, and demonstrate just how simple it is to achieve. They introduce us to a series of personas—Trevor, Emily, Jacob, Lea, Vishnu, Steven, Maria, and Carol—and help us effortlessly slip into each of their shoes, to see the struggles they experience when using the web, and to recognize our own needs and desires in their own.

+

+ In a time when many of us are scrambling to keep up with technological advancements and the opportunities they create, this book grounds us in what really matters: people. This book is a roadmap to providing incredible customer service and realizing the Golden Rule in our work and—much like the code we write and experiences we design—the ripple effect it generates is sure to bring about a more equitable web.

+
+

+ What you have just read is the Foreword I wrote for A Web For Everyone: Designing Accessible User Experiences by Sarah Horton & Whitney Quesenbery. If you work on the web at all, you need this book. It’s an amazingly easy read that brings humanity to an often dry subject and I am very thankful to have been a part of it.

diff --git a/export/2014-02-26-filemaker.md b/export/2014-02-26-filemaker.md new file mode 100644 index 0000000..196d3f0 --- /dev/null +++ b/export/2014-02-26-filemaker.md @@ -0,0 +1,24 @@ +--- +title: "Filemaker" +date: 2014-02-26 20:44:00 +comments: true +tags: + - "conferences" + - "culture & society" +description: "I started attending SXSW in 1997 as a music journalist. I ran a small indie music & entertainment rag in Florida at the time and was invited by one of my publicist friends (I’m looking at you, Rey ) to crash in his room and check out..." +permalink: /archives/filemaker/ +--- + +

+ I started attending SXSW in 1997 as a music journalist. I ran a small indie music & entertainment rag in Florida at the time and was invited by one of my publicist friends (I’m looking at you, Rey) to crash in his room and check out the festival/conference. I scored a press badge and saw some amazing shows, but Interactive wasn’t really on my radar.

+

+ Two years later, my little publication became a media sponsor of SXSW and I got a Platinum badge, granting me access to everything SXSW had to offer. I didn’t attend any of the Interactive panels—I was far more interested in seeing Tom Waits’ first live performance in 10 years, meeting Richard Linklater and Robert Rodriguez, and interviewing Janeane Garofalo—but I did check out the trade show.

+

+ At the time, the trade show was mixed: music, film & interactive all lumped in together. (The conferences overlapped more at the time as well.) It was an interesting time because many labels were experimenting with interactive CDs and such, but MP3s and digital downloads were still pretty uncommon. Napster had just launched that year and only one major-label band at the time—They Might Be Giants—had the foresight to issue a digital-only album: Long Tall Weekend.

+

+ Amid all of the music and film-related hubbub, I made my way over to the one corner devoted to Interactive’s vendors. While perusing the wares and looking for cool swag (of which there was none), I discovered a guy hawking something called a “Content Management System”. It sounded marvelous. I had been doing static (framed, of course) HTML versions of my magazine for about two years at that point and the idea of being able to enter and maintain the content in a more dynamic and flexible format was mind-blowing. I have no idea what the software was called, but the back-end was Filemaker. I bought it, of course. It wasn’t until I got back to my hotel room that I realized Filemaker was Mac-only. I was on Windows. Cue the sad trombone.

+

+ I never once ended up installing or using that early CMS, but it sowed a seed in my mind of the possibilities for a website and I began to take my practice of web design more seriously. I taught myself PHP and MySQL and just kept going. And I owe it all to that guy and his Filemaker CMS.

+
+

+It’s hard to believe that SXSW Interactive is 20 years old. Reading through the remarks and stories in this awesome piece from Fast Company, I felt inspired to share some of my SXSW stories. This is the first.

diff --git a/export/2014-02-27-hallways.md b/export/2014-02-27-hallways.md new file mode 100644 index 0000000..c74e0e2 --- /dev/null +++ b/export/2014-02-27-hallways.md @@ -0,0 +1,29 @@ +--- +title: "Hallways" +date: 2014-02-27 22:09:00 +comments: true +tags: + - "awards" + - "conferences" + - "culture & society" +description: "I stopped attending SXSW as a journalist in 2000. I’d gotten pretty burned out running the magazine, so I decided to take a break and focus on my web work. Little did I know, 5 years later I’d be back because a site I built was a..." +permalink: /archives/hallways/ +--- + +

+ I stopped attending SXSW as a journalist in 2000. I’d gotten pretty burned out running the magazine, so I decided to take a break and focus on my web work. Little did I know, 5 years later I’d be back because a site I built was a finalist in the Interactive Awards.

+

+ At the time, the panels for Interactive occupied roughly 3-4 rooms upstairs, in the far corner of the Convention Center. We were the AV club to Music & Film’s jocks and cool kids in the high school cafeteria. But, to me, walking into that corner was like swimming up to a coral reef teeming with schools of incredible fish. I recognized so many of our industry’s luminaries as I floated through: Eric Meyer, Jeffrey Zeldman, Jason Fried, Tantek Çelik … These were people whose blog posts and articles had helped me solve issues I was having, people that helped me hone my craft, people that were indirectly responsible for me being there as a finalist in the awards. And unlike the reef fish, they didn’t spook when I sidled up to them and said hello.

+
+
+

+ Younger versions of ourselves: Ian Lloyd, Ethan Marcotte, Andy Clarke, Andy Budd, Glenda Sims, Jeffrey Zeldman, Richard Rutter, Shaun Inman, Rob Weychert, Faruk Ateş, Jon Hicks, and more sprawled on the floor.
+Photo credit: Jeremy Flint.

+
+

+ Everyone was incredibly friendly and I was amazed when they invited me to join them as they sat on the floor and leaned against the walls between and during some of the sessions. The hallway became our meeting place and I began to meet more amazing people, many of whom were just starting to make a splash in our then-young industry: Jeremy Keith, Andy Budd, Richard Rutter, Matt Mullenweg, Jason Santa Maria, Jon Hicks, Rob Weychert, Ethan Marcotte, Ian Lloyd, Cindy Li, and Faruk Ateş to name but a few. Together we bonded on those dirty, industrially-carpeted convention center floors and those relationships became friendships and grew into new businesses and ventures.

+

+ I am incredibly thankful for the opportunity I had to go to SXSW that year. I’m thankful for the carpet and the hallways. And I am ever so thankful for the friends I made there, friends that I still hold dear nearly 10 years later.

+
+

+It’s hard to believe that SXSW Interactive is 20 years old. Reading through the remarks and stories in this awesome piece from Fast Company, I felt inspired to share some of my SXSW stories. This is the second.

diff --git a/export/2014-02-28-the-hampton.md b/export/2014-02-28-the-hampton.md new file mode 100644 index 0000000..5cc4d6d --- /dev/null +++ b/export/2014-02-28-the-hampton.md @@ -0,0 +1,58 @@ +--- +title: "The Hampton" +date: 2014-02-28 10:19:00 +comments: true +tags: + - "conferences" + - "culture & society" + - "social networks" +description: "For a few years in the late aughts, the place to stay during SXSW Interactive was the Hampton Inn at San Jacinto and 2nd. There were 3 main reasons for this: 1) proximity to the Convention Center, 2) free breakfast, and 3) happy hour..." +permalink: /archives/the-hampton/ +--- + +

+ For a few years in the late aughts, the place to stay during SXSW Interactive was the Hampton Inn at San Jacinto and 2nd. There were 3 main reasons for this: 1) proximity to the Convention Center, 2) free breakfast, and 3) happy hour and a spacious veranda on which to enjoy it.

+

+ I’m sure a handful of the other nearby hotels offered similar amenities (though perhaps not the veranda), but for whatever reason we all seemed to gravitate to the Hampton.

+

+ Over the years, I had the pleasure of meeting and enjoying both company and conversation with dozens of the web’s brightest minds, but my favorite memories from that particular hotel revolved around food.

+

+ Breakfast was always a big draw and quickly began to take on an almost tailgating-esque significance. We’d meet in the breakfast area, load up on breakfast meats, eggs, and pastries and then compare notes and plan out our day. And if nothing interesting was on deck for a bit, we’d pull out our DSes and engage in some pretty epic races in Mario Kart. Shaun Inman usually won, but Dan Cederholm, Jason Santa Maria, and Rob Weychert were pretty good too. I rarely placed.

+

+ We didn’t compete for anything but bragging rights. And occasionally bacon.

+
+
+

+ Molly Holzschlag, Faruk Ateş, Jeremy Keith, and Jessica Spengler preparing to take on the day.
+Photo credit: Jenifer Hanen.

+
+
+

+ One particularly awesome Hampton institution was wine & cheese. But before I get into what it was and its significance, let me first talk about its lovely host.

+

+ When I attended SXSW 2005, I didn’t know anyone. Sure, I followed a bunch of people’s blogs and articles, but I didn’t really know any of them. I went to SXSW hoping to change that and was successful beyond my wildest dreams. One of the most amazing people I met at SXSW has never been a household name even though she was wildly ahead of her time: Jenifer Hanen (or Ms. Jen as she’s affectionately known).

+

+ When I first saw Ms. Jen, I stopped dead in my tracks. The woman who stood before me looked remarkably familiar, but I could not for the life of me figure out why. She looked back at me with what I can only imagine was a perfect mirror of the perplexed expression I was wearing.

+

+“I know you. But why?” we asked in near unison.

+

+ After rooting around in our past lives a bit, we realized that we had met at SXSW nearly a decade earlier when we were both journalists covering the music festival. Not only that, but we had met through a mutual friend… Rey Roldan (a pivotal figure in my first story).

+

+ Mystery solved, we filled each other in on what we’d been up to since we’d last met and how it was we both had come to work on the web. Ms. Jen was incredibly interested in the future of mobile photography. In 2005, she was running around snapping photos on her Nokia 7610. She was always ahead of the curve, realizing the latent potential of mobile while most of us were still grumbling about IE6.

+

+ Ms. Jen had been coming to SXSW for quite some time and got to know the staff at the Hampton, who routinely hooked her up with one of the suites, meaning she had a couch, a coffee table, and a ’fridge… three important facilities if you plan on hosting a wine & cheese party. Which is exactly what she and some friends decided to do in 2006.

+
+
+

+ Jon Hicks, Veerle Pieters, and Kenneth Himschoot at the inaugural wine & cheese party.
+Photo credit: Jenifer Hanen.

+
+

+ When I arrived at the party, I was greeted by Ms. Jen playing the attentive hostess. I was given a glass of wine and plopped myself down on the floor and introduced myself to the little group Jen had gathered. It was a small group, but the conversations were fantastic and I met a number of amazing individuals whose friendships I value tremendously: Kenneth Himschoot, Lauren Isaacson, Chris Mills, Veerle Pieters, Jessica Spengler, and Steph Troeth.

+

+ Ms. Jen’s wine & cheese parties quickly became a staple of our annual pilgrimage to SXSW Interactive. Each year, more people came until the crowd got so large you literally could not fit another human being in the room. Standing room only… including on top of the bed and some of the other pieces of furniture. The room would be filled with incredible people you wanted to see and interact with, but was also overcrowded and uncomfortable.

+

+ In a lot of ways, Ms. Jen’s wine & cheese parties were mirroring what was happening with the festival as a whole. But that’s another story for another day.

+
+

+It’s hard to believe that SXSW Interactive is 20 years old. Reading through the remarks and stories in this awesome piece from Fast Company, I felt inspired to share some of my SXSW stories. This is the third.

diff --git a/export/2014-03-02-paper-dolls.md b/export/2014-03-02-paper-dolls.md new file mode 100644 index 0000000..9f54813 --- /dev/null +++ b/export/2014-03-02-paper-dolls.md @@ -0,0 +1,58 @@ +--- +title: "Paper Dolls" +date: 2014-03-02 10:15:00 +comments: true +tags: + - "conferences" + - "culture & society" + - "humor" +description: "Jon Hicks came to SXSW in 2005 and made quite an impression on many of us. It’s no surprise, he’s an incredibly nice chap." +permalink: /archives/paper-dolls/ +--- + +

+Jon Hicks came to SXSW in 2005 and made quite an impression on many of us. It’s no surprise; he’s an incredibly nice chap.

+

+ Well, when Jon could not make it to the festival in 2006, Glenda Sims filled the void with a paper doll named Flathicks. The idea was that, through Flathicks, Jon would be able to be photographed at parties, talks, and the like, so it’d be like he was there.

+
+
+

+ Shaun Inman, Flathicks and Jason Santa Maria.
+Photo credit: Brian Warren

+
+

+ Flathicks quickly took on a life of his own and his adventures went far beyond SXSW. We handed him off, from person to person, as we traveled around the world, back to our homes, to other conferences, etc. In fact, he joined Kelly & me in Sydney, Australia in late 2007 at Web Directions South and then flew back with us to San Francisco to attend An Event Apart. You can follow his adventures on Flickr or even check out his personal Flickr account.

+
+

+ Kelly has had a bit of bad luck when it comes to SXSW.

+

+ The first year she went, 2006, she got food poisoning just as we boarded the plane to go to the festival. She ended up spending the majority of her time in our hotel room, recovering, and missed most of the festivities. She did manage to muster enough strength to attend the conference for a bit to see Jeremy Keith & me deliver “How to Bluff Your Way in DOM Scripting”, attend the first Web Standards Project meeting, and then attend a party or two, so the trip was not a complete bust for her. But it wasn’t nearly as enjoyable as it should have been.

+

+ Kelly planned to return with me in 2007, but caught the flu about a week before and had to cancel. SXSW is a well-renowned incubator of illness (“Southby Scurvy” as we affectionately call it) and Kelly did not want to be patient zero that year, so she bowed out.

+

+ Coincidentally, Jeremy’s wife Jessica was unable to make it in 2007 either. So Glenda, being the incredibly sweet woman that she is, made us paper dolls of Kelly and Jessica. She dubbed them “Kellydoll” and “Jessidoll” as she didn’t feel “flat” was appropriate as part of a woman’s nickname.

+
+
+

+ Kellydoll and Jessidoll attending the appropriately-named “Flatstock” (a poster festival).

+
+

+ As with Flathicks, Kellydoll & Jessidoll had amazing adventures at the festival and had their photo taken with everyone who missed them.

+

+ I carried Kellydoll with me everywhere, her head poking out of my backpack. This turned out to be a bad idea, however, as it facilitated her escape. I must have been boring her.

+
+
+

+ Kellydoll enjoying a rib plate at Ironworks.

+
+

+ I realized Kellydoll was missing on the last night of SXSW, on the way back from dinner at Ironworks, where she had been the subject of a few photos. It was raining, so I had been rushing back to my hotel room when I saw another potential opportunity to photograph her. When I reached back to get her, I realized she was gone.

+

+ I spent the next hour and a half combing Ironworks, Red River Road, First Street and a few of the other lanes around the Convention Center, retracing my steps, looking for any sign of Kellydoll. I came up empty-handed & sulked back to the hotel, depressed that I’d lost my little paper wife.

+

+ I never did find Kellydoll, so she and the real Kelly never got to meet. Maybe she’s still out there having adventures. Or maybe she ran off with Flathicks when my back was turned.

+

+ I never trusted that guy.

+
+

+It’s hard to believe that SXSW Interactive is 20 years old. Reading through the remarks and stories in this awesome piece from Fast Company, I felt inspired to share some of my SXSW stories. This is the fourth.

diff --git a/export/2014-04-01-adaptive-design-empathy-and-beating-creative-block.md b/export/2014-04-01-adaptive-design-empathy-and-beating-creative-block.md new file mode 100644 index 0000000..ddedba4 --- /dev/null +++ b/export/2014-04-01-adaptive-design-empathy-and-beating-creative-block.md @@ -0,0 +1,28 @@ +--- +title: "Adaptive Design, Empathy & Beating Creative Block" +date: 2014-04-01 14:33:00 +comments: false +tags: + - "browsers" + - "design" + - "mobile" + - "presentations" + - "progressive enhancement" + - "web standards" +description: "Lately I’ve been doing a few more podcasts, local events, and interviews. Here’s a round-up of a few that posted in the last week or so:" +permalink: /archives/adaptive-design-empathy-and-beating-creative-block/ +--- + +

+ Lately I’ve been doing a few more podcasts, local events, and interviews. Here’s a round-up of a few that posted in the last week or so:

+

+The modern.IE Podcast
+ Josh Holmes interviewed me about adaptive design, progressive enhancement, and a wide range of other things.

+

+PKNCHA#15
+ I did a PechaKucha talk on empathy and the golden rule last December. The video is now on YouTube.

+
+
+

+15 Pro Techniques for Beating Creative Block
+ I give my thoughts on the topic, but you can also read the fantastic thoughts of Dan Rubin, Trent Walton, Derek Featherstone, and Rachel Shillcock.

diff --git a/export/2014-04-15-beyond-responsive-workshops-this-may.md b/export/2014-04-15-beyond-responsive-workshops-this-may.md new file mode 100644 index 0000000..c397db7 --- /dev/null +++ b/export/2014-04-15-beyond-responsive-workshops-this-may.md @@ -0,0 +1,37 @@ +--- +title: "Beyond Responsive Workshops this May" +date: 2014-04-15 14:39:00 +comments: false +tags: + - "Code & Creativity" + - "business" + - "coding" + - "conferences" + - "presentations" + - "progressive enhancement" +description: "A lot fo my work lately has been consulting and working with teams to help them to establish or improve upon their responsive strategies." +permalink: /archives/beyond-responsive-workshops-this-may/ +--- + +

+ A lot of my work lately has been consulting and working with teams to help them establish or improve upon their responsive strategies.

+

+ I love this sort of work and I live for helping teams and individuals tackle the thorny issues (in code or processes) that make responsive projects a challenge. For a lot of small companies, it can be difficult to pull together enough budget to fly me in for a few days of private working sessions, which is why I am such a huge fan of running public workshops… especially über-affordable ones like the two I am leading this May.

+

+ The first will be on my home turf in Chattanooga, Tennessee on May 2nd and I will be co-leading the workshop with my esteemed colleague Brad Frost. It’s the first workshop from our successful Code & Creativity event series and should be a heck of a lot of fun. There are a few tickets left for $399 each on Eventbrite.

+

+ The second workshop will be in Düsseldorf, Germany on May 21st. There are a handful of tickets still available for €349 (VAT included) and a ticket also gets you into the incredible Beyond Tellerrand conference which runs on the 19th & 20th. I’ve spoken at this conference twice before and it is one of only a handful of events in the world I enthusiastically recommend attending.

+

+ Here’s a rough idea of what I’ll be covering in the two workshops:

+
+

+ Responsive web design has taken our industry by storm and with good reason: it helps us improve our reach with less effort. But incorporating responsive design is not the goal; meeting our users’ needs is. Responsive design is not an end in itself… it’s just the beginning.

+

+ We need to embrace the heterogeneous nature of the web—myriad web-enabled devices with vastly different dimensions, screen sizes, networks, and capabilities in use by countless individuals, each with their own special needs—and craft experiences that will work anywhere at any time. We need to build robust systems that adapt in ways far beyond aesthetics.

+

+ Each workshop will begin with a discussion of a number of considerations that we should be aware of, beyond screen size and pixel density, and will provide examples of how to adapt our interfaces so they rise to meet our customers’ needs. Then he’ll turn it over to you to propose gnarly design and/or interface challenges you are struggling with. Once everyone’s challenges are collected, attendees will be given the opportunity to form small groups around each and you will spend a portion of the day working on solutions while Aaron mentors each group and pushes you to think more about accessibility, alternate interaction methods, slow networks, and other considerations.

+

+ The workshop will wrap up with brief presentations from each group followed by an open question and answer session.

+
+

+ I hope you’ll join me in Chattanooga or Düsseldorf next month. Bring your questions and your challenges and let’s dig in.

diff --git a/export/2014-07-20-the-native-vs-stylable-tug-of-war.md b/export/2014-07-20-the-native-vs-stylable-tug-of-war.md new file mode 100644 index 0000000..a05987c --- /dev/null +++ b/export/2014-07-20-the-native-vs-stylable-tug-of-war.md @@ -0,0 +1,76 @@ +--- +title: "The “Native” vs. “Stylable” Tug of War" +date: 2014-07-20 08:14:00 +comments: true +tags: + - "(x)HTML" + - "books & articles" + - "browsers" + - "client relations" + - "CSS" + - "design" + - "usability" + - "web standards" +description: "In his astute post “ ‘Native experience’ vs styling select boxes ” , Bruce Lawson correctly identified a common tension in the web world:" +permalink: /archives/the-native-vs-stylable-tug-of-war/ +--- + +

+ In his astute post, “‘Native experience’ vs styling select boxes,” Bruce Lawson correctly identified a common tension in the web world:

+
+

+ But why this urge to re-style page elements that end-users are familiar with? … Or is it that we love native look and feel, except when we don’t?

+
+

+ Speaking as the guy who not only wrote JavaScript to help me create an accessible select element alternative, but who also made it the focus of a case study (image) in AdvancED DOM Scripting, I am fully aware of the desire to have it both ways. I have not often seen the desire for both in a single individual, but occasionally it does happen.

+

+ Based on my own experience, I see the following arguments in favor of changing the display of a native control quite often:

+
    +
+ 1. It doesn’t look good to me.
+ 2. It is not “on brand”.
+ 3. It clashes with our brand’s color scheme.
+ 4. We want the web experience to feel like a native app.
+ 5. It doesn’t behave how we think it should.
+

+(n.b. Browsers have done a pretty good job reducing the amount of color and the overall visual strength used in native controls to help them better blend in with a wide variety of designs, so clashes as mentioned in #3 happen far less often than they did nearly a decade ago.)

+

+ As the weathered, battle-tested (and, admittedly, somewhat jaded) front-end dev that I am, I typically push back with one or more of the following:

+

+ In Addressing Desired Design Changes

+

+ In terms of aesthetics (addressing arguments 1, 2, and 3), I understand where you’re coming from. Native controls are not the most appealing things, but they are familiar to your users. A select box they see on your site that looks like the one they see on Wikipedia or their banking site will be immediately recognizable. Sure, the look and feel may differ from browser to browser, but most people use only a small number of browsers throughout the day—at work, at home, on their device—and if you want to ensure the design of a form control feels “right” in the browser they are using, sometimes it’s best to let go of that control.

+

+ In Addressing OS Parity

+

+ I can understand the desire to have a form control in a web page look and feel like the same (or a similar) control within the native operating system (argument 4), but I am not sure that’s a rabbit hole you want to go down. Here’s why: Achieving exact design and functional parity between a native control and a web control quite often requires extra markup, a bunch of CSS, and a bit of JavaScript. Anything is achievable with unlimited time and budget, so it’s completely doable, but it would be good to estimate the cost to see if you still think it is a worthwhile endeavor.

+

+ Assuming it is, we then have the question of which operating system to model the control after. Or maybe you want to offer a different take on the control based on the operating system your user is using. In that case, we may need to multiply the original estimate by the number of operating systems you want to support. But it’s worth noting that, in the Android world, different device manufacturers often “skin” the operating system to look different from other ones. Sometimes they even do it on a device-by-device basis. We’ll need to figure out which ones you want to include in your native control matrix and multiply the estimate accordingly.

+

+ Then there’s maintenance. We’ll need to test these native-like controls on each of their corresponding platforms and test the script that determines which experience gets delivered to which device to make sure we’re not accidentally sending the wrong experience. We’ll also need to test the delivery script on every other browser in our test matrix to ensure it is not causing issues there.

+

+ What should we do when a new operating system version is rolled out? iOS, for example, has made radical shifts in the design of their native controls in each major release. We’ll probably want to create unique versions of the control for each version of each OS we support and we’ll need to keep tabs on upgrades so we don’t end up confusing our users if they visit our site in iOS 7 and have a control that looks like it’s from iOS 6. We’ll need to add the number of OS versions into the multiplier as well.

+

+ Ok, and finally: How many controls did you want to apply this approach to again?

+

+ Or we could use the native form control and it will just work.

+

+ In Addressing Altered Behavior

+

+ I completely agree that not all native controls work exactly how I would like, but there are several risks in changing the expected behavior of a native control.

+

+ First of all, there’s the possibility we could actually end up making the interface more confusing or that the change in behavior might not be exactly what our users wanted (either based on what they are used to or our mental model not aligning with theirs). In order to rule out these issues, we should run a few rounds of usability tests. These could be quick café tests or more formal studies depending on the budget.

+

+ Assuming our tests go well, we will need to maintain this code and do all of the requisite browser testing. And potentially upgrade our code as new browsers and browser versions come out. Depending on the complexity of the code, this could become a large requirement, but if it is ultimately in the service of making the web a better, more usable interaction environment, it could be worth it.

+

+ For what it’s worth, if we go this route and are successful, we should consider getting involved in the spec-writing process at the W3C or WhatWG. We should contribute our recommended changes back to the community and share what we learned. If we make a compelling argument, perhaps our idea will become part of some future standard and we can taper off our browser testing when the change goes native.

+
+

+ As you can probably tell, I’m not a really big fan of changing existing controls as I feel it can amount to a wasted effort. That said, if there are design improvements to be made—“design” in the true sense: being about how usable something is, not just how aesthetically pleasing it is to someone (e.g. improving contrast, making the control more intuitive, etc.)—I’m willing to accept the change as something we should do and then work to make sure that change has been vetted and, if successful, given away for inclusion in other projects. If it solves a major issue on the web, I want to give that change every opportunity to make it into the appropriate spec by talking to the appropriate folks about it in person, in blog posts, and on the appropriate mailing list. If the change solves a problem in a specific browser, I want to see it incorporated into said browser and will file a bug report and try to build momentum around it by engaging the community.

+

+ Anyway, that’s my general position on augmenting native controls. What are your thoughts on the topic?

diff --git a/export/2014-09-22-responsive-typography.md b/export/2014-09-22-responsive-typography.md new file mode 100644 index 0000000..6ff2b57 --- /dev/null +++ b/export/2014-09-22-responsive-typography.md @@ -0,0 +1,46 @@ +--- +title: "Responsive Typography" +date: 2014-09-22 10:19:00 +comments: false +tags: + - "books & articles" + - "CSS" + - "design" + - "mobile" + - "optimization & performance" + - "progressive enhancement" + - "usability" + - "web standards" +description: "I’m incredibly excited to see that Jason Pamental ’s fantastic Responsive Typography is finally available. I had the great pleasure of reviewing an early galley and I can honestly say that it’s a book well worth reading. Jason’s natural..." +permalink: /archives/responsive-typography/ +--- + +

+ I’m incredibly excited to see that Jason Pamental’s fantastic Responsive Typography is finally available. I had the great pleasure of reviewing an early galley and I can honestly say that it’s a book well worth reading. Jason’s natural and friendly style makes for an easy read and it’s chock-full of actionable recommendations and tips you’ll want to start using right away.

+

+ In fact, I think Responsive Typography is such an invaluable book that I offered to write the Foreword, and Jason (and O’Reilly) have been kind enough to let me reprint it here:

+
+

+ Back in my day, all we had was the font element.

+

+ I fully realize that makes me sound like an old man, but I’m not ready to chase young whippersnappers off my lawn quite yet. But the fact remains that when I taught myself how to build web pages back in the mid-’90s, our design options were fairly limited. Heck, my first experience on the Web was on a text-based browser that provided me access to page upon glorious page of stark, blocky Courier. White text. Black background. 100% responsive.

+

+ When visual browsers finally hit the scene, ushering in images and the font element, we web designers finally had the opportunity to move out of monospace. I’ll leave it to Jason to delve into the history of typography on the Web, but the advent of visual browsers opened the floodgates for use (and abuse) of type online. It was the desktop publishing revolution all over again: a direct assault on the sensibilities of anyone with even the slightest understanding of typography.

+

+ Over the years, we’ve made a lot of mistakes with web type: Fonts embedded in images. Fonts embedded in Flash. Fonts embedded in JavaScript. Many of those were attempts to bypass the gridlock created by browser makers, type foundries, and the W3C, who couldn’t come to a consensus on how to balance a desire for more type options on the Web while ensuring typographers got paid for all of their hard work. While they bickered, we soldiered on, looking for more accessible and maintainable ways to use more typefaces.

+

+ And while we were busy tinkering with sIFR and Cufón, eagerly awaiting the day we could abandon those hacks and have real browser support for actual font formats, an army of little black rectangles had caught a whiff of the awesome content we were serving up to desktop browsers.

+

+ Like ants at a Sunday picnic, these little black rectangles initially appeared one or two at a time. They were easily ignored, a nuisance. Nothing to take too seriously. But before we knew what was happening, that trickle turned into a flood and those little rectangles were hungry. Instead of taking a crumb here and there—which we tossed to them with a great sense of self-satisfaction—these ambitious ants were carrying off whole deli trays and the friggin’ New York Times.

+

+ These little black rectangles are, of course, the surge of handheld (or at least hand-holdable) devices that have been redefining our concept of “the Web” almost daily. They exhibit widely variable screen sizes, from about the size of a matchbook to ones bigger than an extra-large pizza. They sport a plethora of pixel densities, new interaction methods, unpredictable network connection speeds, and low-powered processors that can’t possibly compete with traditional laptop and desktop CPUs (not to mention a handful of different operating systems and browsers). All of these factors affect how—and even whether—your typographic choices will make it to your customers, and it’s a lot to take in.

+

+ Thankfully, Jason has your back. The book you’re now reading is invaluable: it’s chock-full of useful approaches, practical code samples, and advice for dealing with typography in the age of responsive web design.

+

+ By the time you finish this brief book, you’ll be ready to handle pretty much any device someone may throw at you. But hopefully they won’t. Devices are hard. And expensive.

+

+ — Aaron Gustafson
+    Author, Adaptive Web Design

+
+

+ By the way, if you’re on a typography kick, I’ll also recommend an excellent new book by another Jason I respect greatly: Jason Santa Maria’s On Web Typography. The two books complement each other perfectly, with very little overlap. They’d make an awesome bundle.

diff --git a/export/images/a-load-of-malarkey/a_lot_of_malarkey_with_ie1.png b/export/images/a-load-of-malarkey/a_lot_of_malarkey_with_ie1.png new file mode 100644 index 0000000..6adee26 Binary files /dev/null and b/export/images/a-load-of-malarkey/a_lot_of_malarkey_with_ie1.png differ diff --git a/export/images/apple-vs-the-open-web/ios-maps-transit.png b/export/images/apple-vs-the-open-web/ios-maps-transit.png new file mode 100644 index 0000000..61e7ddd Binary files /dev/null and b/export/images/apple-vs-the-open-web/ios-maps-transit.png differ diff --git a/export/images/automatically-opting-in-to-ie-standards-mode/wasp-logo.png b/export/images/automatically-opting-in-to-ie-standards-mode/wasp-logo.png new file mode 100644 index 0000000..156b105 Binary files /dev/null and b/export/images/automatically-opting-in-to-ie-standards-mode/wasp-logo.png differ diff --git a/export/images/evernote-for-interface-inventories/interface-inventory-evernote-inventory.png b/export/images/evernote-for-interface-inventories/interface-inventory-evernote-inventory.png new file mode 100644 index 0000000..386320d Binary files /dev/null and b/export/images/evernote-for-interface-inventories/interface-inventory-evernote-inventory.png differ diff --git a/export/images/evernote-for-interface-inventories/interface-inventory-screen-snapping.png b/export/images/evernote-for-interface-inventories/interface-inventory-screen-snapping.png new file mode 100644 index 0000000..0d3d3ef Binary files /dev/null and b/export/images/evernote-for-interface-inventories/interface-inventory-screen-snapping.png differ diff --git a/export/images/evernote-for-interface-inventories/interface-inventory-skitch-prefs.png b/export/images/evernote-for-interface-inventories/interface-inventory-skitch-prefs.png new file mode 100644 index 0000000..e32c9c6 Binary files /dev/null and b/export/images/evernote-for-interface-inventories/interface-inventory-skitch-prefs.png differ diff --git 
a/export/images/evernote-for-interface-inventories/interface-inventory-skitch-screenshot.png b/export/images/evernote-for-interface-inventories/interface-inventory-skitch-screenshot.png new file mode 100644 index 0000000..2b4a16a Binary files /dev/null and b/export/images/evernote-for-interface-inventories/interface-inventory-skitch-screenshot.png differ diff --git a/export/images/evernote-for-interface-inventories/interface-inventory-skitch.png b/export/images/evernote-for-interface-inventories/interface-inventory-skitch.png new file mode 100644 index 0000000..82d0c59 Binary files /dev/null and b/export/images/evernote-for-interface-inventories/interface-inventory-skitch.png differ diff --git a/export/images/evernote-for-interface-inventories/interface-inventory-web-clipper.png b/export/images/evernote-for-interface-inventories/interface-inventory-web-clipper.png new file mode 100644 index 0000000..e59aa46 Binary files /dev/null and b/export/images/evernote-for-interface-inventories/interface-inventory-web-clipper.png differ diff --git a/export/images/experimenting-with-grids-using-ecsstender/the-grid-system.png b/export/images/experimenting-with-grids-using-ecsstender/the-grid-system.png new file mode 100755 index 0000000..087c071 Binary files /dev/null and b/export/images/experimenting-with-grids-using-ecsstender/the-grid-system.png differ diff --git a/export/images/from-mobile-friendly-to-mobile-first/blog-comparison-mid.png b/export/images/from-mobile-friendly-to-mobile-first/blog-comparison-mid.png new file mode 100644 index 0000000..2bd13e8 Binary files /dev/null and b/export/images/from-mobile-friendly-to-mobile-first/blog-comparison-mid.png differ diff --git a/export/images/from-mobile-friendly-to-mobile-first/blog-comparison-small.png b/export/images/from-mobile-friendly-to-mobile-first/blog-comparison-small.png new file mode 100644 index 0000000..477f525 Binary files /dev/null and 
b/export/images/from-mobile-friendly-to-mobile-first/blog-comparison-small.png differ diff --git a/export/images/from-mobile-friendly-to-mobile-first/blog-comparison-tiny.png b/export/images/from-mobile-friendly-to-mobile-first/blog-comparison-tiny.png new file mode 100644 index 0000000..1f8635a Binary files /dev/null and b/export/images/from-mobile-friendly-to-mobile-first/blog-comparison-tiny.png differ diff --git a/export/images/from-mobile-friendly-to-mobile-first/blog-comparison-wide.png b/export/images/from-mobile-friendly-to-mobile-first/blog-comparison-wide.png new file mode 100644 index 0000000..a7e3c6d Binary files /dev/null and b/export/images/from-mobile-friendly-to-mobile-first/blog-comparison-wide.png differ diff --git a/export/images/hallways/hallways.jpg b/export/images/hallways/hallways.jpg new file mode 100644 index 0000000..88e6da9 Binary files /dev/null and b/export/images/hallways/hallways.jpg differ diff --git a/export/images/i-finally-wrote-a-book/adaptive-web-design-stack.jpg b/export/images/i-finally-wrote-a-book/adaptive-web-design-stack.jpg new file mode 100755 index 0000000..49d4f25 Binary files /dev/null and b/export/images/i-finally-wrote-a-book/adaptive-web-design-stack.jpg differ diff --git a/export/images/i-missed-it/WDiaN.gif b/export/images/i-missed-it/WDiaN.gif new file mode 100644 index 0000000..2e3a96f Binary files /dev/null and b/export/images/i-missed-it/WDiaN.gif differ diff --git a/export/images/i-wish-id-known-that/sexy-presentations.jpg b/export/images/i-wish-id-known-that/sexy-presentations.jpg new file mode 100644 index 0000000..f8b1b51 Binary files /dev/null and b/export/images/i-wish-id-known-that/sexy-presentations.jpg differ diff --git a/export/images/on-redirecting-mobile-traffic/costco-redirect.png b/export/images/on-redirecting-mobile-traffic/costco-redirect.png new file mode 100755 index 0000000..360d58c Binary files /dev/null and b/export/images/on-redirecting-mobile-traffic/costco-redirect.png differ diff 
--git a/export/images/png-color-oddities-in-ie/png_compare.png b/export/images/png-color-oddities-in-ie/png_compare.png new file mode 100644 index 0000000..8a0878e Binary files /dev/null and b/export/images/png-color-oddities-in-ie/png_compare.png differ diff --git a/export/images/retreat-remembered/retreat-1-group.jpg b/export/images/retreat-remembered/retreat-1-group.jpg new file mode 100755 index 0000000..792029e Binary files /dev/null and b/export/images/retreat-remembered/retreat-1-group.jpg differ diff --git a/export/images/say-what-you-mean/aitErrorMsg.jpg b/export/images/say-what-you-mean/aitErrorMsg.jpg new file mode 100644 index 0000000..48094ec Binary files /dev/null and b/export/images/say-what-you-mean/aitErrorMsg.jpg differ diff --git a/export/images/speeding-up-your-code-with-the-bitwise-operator/binary.gif b/export/images/speeding-up-your-code-with-the-bitwise-operator/binary.gif new file mode 100644 index 0000000..dfc8228 Binary files /dev/null and b/export/images/speeding-up-your-code-with-the-bitwise-operator/binary.gif differ diff --git a/export/images/template-based-asset-munging-in-expressionengine/2010-07-11-style-templates.png b/export/images/template-based-asset-munging-in-expressionengine/2010-07-11-style-templates.png new file mode 100644 index 0000000..bdc9e91 Binary files /dev/null and b/export/images/template-based-asset-munging-in-expressionengine/2010-07-11-style-templates.png differ diff --git a/export/images/wait-for-it/please_wait_t.jpg b/export/images/wait-for-it/please_wait_t.jpg new file mode 100644 index 0000000..d317639 Binary files /dev/null and b/export/images/wait-for-it/please_wait_t.jpg differ diff --git a/export/images/you-cant-rely-on-javascript/lala-fail.png b/export/images/you-cant-rely-on-javascript/lala-fail.png new file mode 100755 index 0000000..aa8ec9d Binary files /dev/null and b/export/images/you-cant-rely-on-javascript/lala-fail.png differ diff --git 
a/export/images/you-cant-rely-on-javascript/lifehacker-fail.png b/export/images/you-cant-rely-on-javascript/lifehacker-fail.png new file mode 100755 index 0000000..974a9ab Binary files /dev/null and b/export/images/you-cant-rely-on-javascript/lifehacker-fail.png differ diff --git a/export/images/zoom-layouts-v2/drink-drive-lose-ad-challenge-small.png b/export/images/zoom-layouts-v2/drink-drive-lose-ad-challenge-small.png new file mode 100644 index 0000000..455bb41 Binary files /dev/null and b/export/images/zoom-layouts-v2/drink-drive-lose-ad-challenge-small.png differ diff --git a/export/images/zoom-layouts-v2/drink-drive-lose-ad-challenge.png b/export/images/zoom-layouts-v2/drink-drive-lose-ad-challenge.png new file mode 100644 index 0000000..0f5f3bf Binary files /dev/null and b/export/images/zoom-layouts-v2/drink-drive-lose-ad-challenge.png differ