June 2004 - Posts

A Google of a Charitable Idea
21 June 04 01:29 PM | Scott Mitchell

For those that don't know, Google offers a set of contextually targeted text-based and image-based ads that you can serve from your Web site by adding just a few lines of client-side JavaScript. This program, dubbed Google AdSense, allows publishers of online content to easily make a buck or two from their site's traffic without having to go through the hassle of creating/buying ad-serving software or soliciting and billing advertisers. The publisher gets a few cents each time a visitor clicks on a Google Ad. ScottOnWriting.NET has a Google Ad banner up at the top, for example.

While Google's AdSense program is easy to get set up, only sites with heavy traffic will make significant revenue. Most small sites, like my blog, can only expect to make a few cents a day. I have contemplated taking down the Google Ads on my blog, figuring the clutter wasn't worth the thirty or so dollars I expected to make from a full year of Google Ads, but I'm still evaluating the program... we'll see. It occurred to me, though, that there are likely other bloggers and low-traffic sites out there, each making a few cents a day as well. While alone that amount of money is inconsequential, pooled together this collective income could really add up. It then hit me - why not create a single Google AdSense account that anyone could use, and then donate the income generated to a charity?

Sounds like a great idea, no? Unfortunately, it is in direct violation of Google AdSense's Terms and Conditions. Specifically, Rule 5, Section (vi) states:

You shall not, and shall not authorize or encourage any third party to: directly or indirectly access, launch and/or activate Ads through ... any ... Web site or other means other than Your Site(s)...

So creating one pooled account that any number of small sites could use to collect the revenue generated through click-throughs would be against Google AdSense's policies. To be on the safe side, I emailed the Google AdSense support desk and asked if some exception could be made for such a charitable purpose. Unfortunately, the response was in the negative:

Unfortunately, as you mentioned, publishers may not publish ads on pages that they do not own or have control over. We appreciate your suggestion and encourage you to continue to let us know how we can improve Google

It's too bad Google's not interested in such a scenario. I think it would be a wild success, since there are God knows how many sites out there that fit the bill: too small to generate serious scratch, but whose owners would be interested in donating a few cents, especially since all they'd have to do is add a few lines of JavaScript code. Attention Google: you're missing a great PR opportunity! :-)

Confirming Before Postback
18 June 04 02:35 PM | Scott Mitchell

There are a number of articles out there about building a DataGrid with a ButtonColumn of Delete buttons, and using a bit of client-side JavaScript in the Button's client-side onclick event handler in order to present the user with a confirm popup asking them if they're sure they really want to delete the record. For example, such an article can be found in An Extensive Examination of the DataGrid Web Control: Part 8. Essentially you add the onclick event handler to the Button's Attributes collection:

button.Attributes["onclick"] = "javascript:return confirm(...);";
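To see why this works on the client, here is a minimal sketch (plain JavaScript, runnable outside a browser) of the handler the rendered button ends up with. The `confirmFn` parameter is a stand-in for `window.confirm`; the key point is that a click handler returning false cancels the form submission, and with it the postback.

```javascript
// Sketch of the client-side behavior of the rendered delete button.
// confirmFn stands in for window.confirm so the logic can run anywhere.
function makeDeleteHandler(message, confirmFn) {
  return function () {
    // Returning false from onclick cancels the submit (no postback).
    return confirmFn(message);
  };
}
```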

I was working on a project recently where a client wanted similar functionality, but with a bit of a twist. He had a DropDownList populated with some “parent” values, with the parent's details displayed in the Web page. Changing the DropDownList item would cause the page to be posted back and the page loaded with the newly selected DropDownList's details records. Simple enough to implement: just add a DropDownList with the AutoPostback property set to True, and write an event handler for the SelectedIndexChanged event.

However, under certain conditions, when the user changed the DropDownList item, this client wanted the user to be prompted with a client-side confirm box, asking them if they were sure they wanted to leave the current details information. (Imagine, for instance, that the details for a particular master record had some “critical” tasks that needed to be done, and if these had not been checked off, a warning would need to be displayed saying something like, “Are you sure you want to leave, you still have critical tasks left unchecked.”)

To accomplish this, I needed to add an onchange client-side event handler for the DropDownList. However, the DropDownList already emits some client-side script in the onchange event - namely the code to initiate a postback. That is, the DropDownList, with AutoPostback set to True, renders like:

<select onchange="__doPostBack(...)">

Adding an onchange event to the DropDownList's Attributes collection prepends whatever you add to the __doPostBack(...) script emitted. So, we want to add script that calls __doPostBack(...) only if the user hits OK in the confirm popup. The following will accomplish this:

ddl.Attributes["onchange"] = "if (confirm(...)) ";

With this client-side script, the DropDownList will be rendered as:

<select onchange="if (confirm(...)) __doPostBack(...)">

And that's what we want! Now, if the user changes the DropDownList value, it will popup a client-side confirm box, saying, “Are you sure you want to change the master record?” (or whatever). If the user clicks OK, it will postback and update the detail view. If they click cancel, the postback will not occur, and the user will remain on the same page.

This approach, though, is not complete, I'm afraid. If the user clicks Cancel it's true that the page won't be posted back, but the DropDownList will still display the newly selected list item. That is, if the user is viewing master record X and then changes to Y, they'll see the confirm box. If they click Cancel, a postback won't occur, but the DropDownList will show Y (not X).

One approach to fix this is to add a bit of client-side code to the page that records the DropDownList's selected index on the body's onload event. Then, a client-side function can be created that resets the DropDownList's selected index back to the original one. This function would be called if the user clicked cancel. To get everything to work, the onchange script needs to be changed to:

ddl.Attributes["onchange"] = "if (!confirm(...)) resetIndexes(); else ";

With this client-side script, the DropDownList will be rendered as:

<select onchange="if (!confirm(...)) resetIndexes(); else __doPostBack(...)">

So if the user clicks Cancel on the confirm, the client-side resetIndexes() function will be called, which will reset the DropDownList index. Otherwise, if they clicked OK, the postback will occur. The final piece of the puzzle is the code for the resetIndexes() function and the function that records the original selected index values in the onload event. Assuming your DropDownList's client ID is foo, this could be accomplished with the following client-side code:

var fooIndex;

function saveIndexes()
{
   fooIndex = document.frmName.foo.selectedIndex;
}

function resetIndexes()
{
   document.frmName.foo.selectedIndex = fooIndex;
}

This assumes that you have named your Web Form frmName (<form runat="server" id="frmName">). Also, you'll need to call saveIndexes() in the body onload event - <body onload="saveIndexes();">.
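Putting the pieces together, here is a self-contained sketch of the whole interaction. A plain object stands in for document.frmName.foo, and the confirm and postback functions are passed in as parameters so it runs outside a browser; the names foo and frmName are just the ones used above, and nothing here is ASP.NET-specific.

```javascript
var fooIndex;

// Record the index the page loaded with (called from body onload).
function saveIndexes(select) {
  fooIndex = select.selectedIndex;
}

// Put the list back the way it was (called when the user clicks Cancel).
function resetIndexes(select) {
  select.selectedIndex = fooIndex;
}

// Mirrors the rendered attribute:
//   onchange="if (!confirm(...)) resetIndexes(); else __doPostBack(...)"
function onDropDownChange(select, confirmFn, doPostBack) {
  if (!confirmFn("Are you sure you want to change the master record?")) {
    resetIndexes(select);
  } else {
    doPostBack();
  }
}
```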

Hope this helps someone... :-)

Managing Spam When On Vacation
16 June 04 10:41 AM | Scott Mitchell

Next week I will be embarking on an 11-day honeymoon in Europe with my new wife and have decided to completely unplug myself - no Internet, no email. One concern I'm faced with, though, is how my email account will survive without me pruning the leagues of spam from my Inbox. Every morning when I first log on, Outlook downloads 300+ messages, with 85%+ of them being spam. Of course, I don't see those spammy messages; SpamBayes does a wonderful job of scuttling them to the Spam folder for me automatically. But my dilemma is as follows: assume I get 5 MB worth of email a day, with say 500 KB being non-spam. If I am gone for 11 days, that's 55 MB of email (5.5 MB of email I care about). Now, say that my ISP limits my inbox to 10 MB, or even 25 MB. After a week or so, my inbox will be bouncing valid emails because it's overstuffed with spam.
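The arithmetic is easy to sanity-check; this little sketch just divides the quota by the daily volume (numbers from above, not measurements):

```javascript
// Days until an inbox with quotaMB of space fills up
// at mbPerDay of incoming mail.
function daysUntilFull(quotaMB, mbPerDay) {
  return Math.ceil(quotaMB / mbPerDay);
}
```

At 5 MB a day, a 10 MB quota starts bouncing mail by day 2, and even a 25 MB quota by day 5, well short of an 11-day trip.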

Does anyone see a solution to this other than leaving Outlook running for the duration of my honeymoon? Yeah, yeah, I could use a service like Yahoo Mail (now with 100 MB inboxes) or my new GMail account (thanks, Ambrose!), but I have accounts with limited inbox quotas that receive the lion's share of spam and important emails... Has anyone encountered this problem before? What solution did you employ?

TechEd Wrapup
09 June 04 02:33 PM | Scott Mitchell

I've been meaning to write this wrap-up on TechEd 2004 for the past week and a half, but things have been pretty hectic as of late. Not only have I had a couple of consulting projects going on, but I also have two classes I'm teaching, and was out of town on business most of last week. But the real time consumer has been the pile of last-minute activities for this weekend. See, I'm a-gettin' hitched on Saturday to my girlfriend of the past three and a half years. While we have been planning this as far back as August 2003, needless to say there are a ton of last-minute preparations and details that need attending to. And right now is the calm before the storm. Out-of-town family and friends start arriving late tonight, tomorrow, and Friday, and our place - our tiny 1,000 sq. ft. condo - will be host to a couple of guests both tomorrow night and Friday night. But enough about that, back to TechEd...

This year's TechEd was the first I've ever attended. I've been to a number of conferences in the past, but most were small and ASP/ASP.NET-focused, and none came anywhere near the size of the 12,000 folks who piled in for TechEd 2004. Needless to say, the scale of everything was simply awe-inspiring. At breakfast there were dozens, if not hundreds, of 2'x1' trays piled six inches deep with bacon (or a product pretending to be bacon - I couldn't decide). Just seeing row after row of those little rubbery meat-flavored slabs was impressive enough, and made my entire TechEd experience. I couldn't help but imagine what the kitchen area must be like. I assume there was a team of men doing nothing but making bacon. There may have even been someone to manage that team, a bacon manager, if you will.

Having never been to a large Microsoft conference before, I was unaware of how much food, and how often, they provided. Besides having gobs of food at each breakfast, lunch, and dinner, between each session there was always fruit, bagels, and an assortment of desserts - fruit bars, brownies, cake, popsicles, Dove bars, candy, candy bars, and so on. Just volumes and volumes of it. Needless to say, it was hard to eat healthy during the conference, but what fun is eating healthy anyway?

I did manage to go to some of the sessions, between feeding periods, and have blogged about a couple of them. I went bright and early to Michele Leroux Bustamante's Building Applications with Globalization in Mind. Also, I really enjoyed Rob Howard's Black-Belt ASP.NET talk. But the best part of TechEd was not to be found in the sessions, nor in the cafeteria, but rather in the Cabanas. The Cabanas were in a large, football-field-sized room, surrounding hundreds of public, Internet-connected computers, and were staffed with Microsoft employees who were all too willing to play XBOX with you. Seriously, though, they were a great place to ask questions and get answers. In fact, I was even asked an ASP.NET-related question or two myself and dutifully turned over the XBOX reins to accommodate.

I also got to participate in a book signing on behalf of Sams Publishing. The downside was that it was hosted right at the lunch hour, so turnout was light, but I did get to sign a couple of copies of Teach Yourself ASP.NET in 24 Hours. (Unfortunately, there was a shipping mixup so my ASP.NET Data Web Controls Kickstart books were not available for purchase during TechEd...)

I'm very glad I went to TechEd this year; it was a truly rewarding experience. I got to spend some face time with people I only get to communicate with electronically, got to hear some interesting talks, ate too much, and got my butt kicked in Halo. And all within 10 miles of my front door.

How DOES a Computer Novice Upgrade?
03 June 04 02:12 PM | Scott Mitchell

I recently stumbled across Microsoft's Sacred Cash Cow which examines Microsoft's reliance on sales of Windows and Office to drive their revenues, and how Microsoft has made poor decisions in the past in order to protect their two cash cows. The article's all right, but what really got me thinking was this sentence mid-way through the piece:

[Paul] Andrews[, a Seattle Times columnist,] was surprised to learn recently that Jim Allchin, Microsoft group vice president of platforms, didn’t realize that many users don’t buy new computers because of how hard it is to move all their data and applications. “He was totally oblivious to this,” Andrews says. “It’s a couple-day process. His head was in the clouds.”

And that got me thinking - how does a computer novice upgrade his or her home computer and keep their important files? Take a friend of mine who shall remain nameless. He is one of the most novice computer users I've met, using his computer primarily for downloading songs, email, and the Internet. Now how in the world is he going to upgrade his system and keep those songs he downloaded, along with his Outlook email file, which last time I checked was over 1 GB in size?

If it were me, I'd add the old hard drive to the new machine, dump the files over to the new computer's hard drive, and then remove the old one. Or maybe I'd network the two in an ad-hoc Ethernet configuration. But my friend doesn't know how to take out a hard drive, let alone what one even looks like. He knows how to plug devices into his computer because they are color-coded, so I doubt he'd be able to build a network (which would require either a hub or a crimper to rewire the Ethernet cable). Burning the several gigs of songs and email content to CD would seem too daunting. And even if my friend did manage to get the files over to his new computer, I doubt he'd know what to do next. He plays his MP3s through Kazaa's media player screen. He doesn't know how to tell Outlook to use a different PST file than the default one it is configured to use.

So how do computer novices upgrade and keep the hundreds of MBs, or even GBs, of important data? Do they just not upgrade? Do they take it to Best Buy and ask their techies to do it for them? Do they rely on their computer-geek friends? I wonder... and I'd wager this will become more and more of an issue as hard drive sizes keep increasing and as people start using their computers as a media store for movies, TV shows, and so forth, resulting in very large amounts of data to transfer to new systems.

Understanding ASP.NET View State
03 June 04 09:28 AM | Scott Mitchell

My latest MSDN article is now online, and covers a topic that many developers do not have as tight a grasp on as they may think. Understanding ASP.NET View State looks at how an ASP.NET page maintains its state changes across postbacks, examining:

  • The ASP.NET Page Life Cycle
  • The Role of View State
  • The Cost of View State
  • How View State is Serialized/Deserialized
  • Specifying Where to Store the View State Information (see how to store it in a file on the Web server rather than as a bloated hidden form field)
  • Programmatically Parsing the View State
  • View State and Security Implications
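As a small taste of the parsing topic above: the __VIEWSTATE hidden form field is a base64-encoded string (produced by LosFormatter in ASP.NET 1.x), so peeling off the base64 layer for a quick look is a one-liner. Fully parsing the structure requires LosFormatter itself; this Node-flavored sketch only does the decode, and the sample value in the usage below is illustrative, not a real page's view state.

```javascript
// Decode the base64 layer of a __VIEWSTATE value for inspection.
// (Node's Buffer stands in for a browser's atob here.)
function decodeViewState(base64Value) {
  return Buffer.from(base64Value, "base64").toString("latin1");
}
```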

A special thanks to those who helped review this article, especially those whose feedback was used to improve the article. If you'd be interested in reviewing articles, read this blog entry for more information...

Problems Debugging ASP.NET - Another Gotcha
01 June 04 07:33 PM | Scott Mitchell

I think it's fair to say that if you've never had an “issue” debugging an ASP.NET application through Visual Studio .NET, you're either relatively new to ASP.NET or you're extremely lucky. You don't have to look much further than the microsoft.public.vsnet.debugging newsgroup to find a number of folks who have had problems debugging ASP.NET pages in Visual Studio .NET. For some common problems (and their solutions) see Roy Osherove's blog post The VS.NET 7 Debugger Doesn't Work. What Can I Do?

Yesterday I experienced another ASP.NET debugging problem that isn't addressed in Roy's post or, to my knowledge, in the Microsoft KBs. I was working on a computer that had the ASP.NET 2.0 technology preview installed, which had tweaked IIS so that the ASP.NET-related file extensions - .aspx, .asmx, etc. - were mapped to the ASP.NET 2.0 ISAPI extension (WINDOWS\Microsoft.Net\Framework\v.1.2.xxxx\aspnet_isapi.dll), rather than the ASP.NET v1.1 ISAPI extension (WINDOWS\Microsoft.Net\Framework\v.1.1.xxxx\aspnet_isapi.dll). With this setting, ASP.NET pages would work fine when visited without the debugger attached, but attempting to attach the debugger resulted in an error. The solution was to reconfigure IIS to map the file extensions back to the v1.1 ISAPI extension.

This, sadly, was the last place I looked. I spent nearly an hour poring over newsgroup posts and trying a long list of different suggestions and techniques. The point is, if you have the Whidbey bits installed on your computer and are having trouble debugging ASP.NET 1.x projects, check the AppMappings in IIS first, before pulling your hair out! :-)
