March 2005 - Posts

Upcoming Speaking Events
29 March 05 02:27 PM | Scott Mitchell

Last week I got back from speaking at the ASP.NET Connections conference - thanks to all who attended my sessions. There were a number of great questions, and I felt all of the talks went well (although my last talk - the third in the same day - felt the weakest of the three). Anywho, for those in Southern California, I have two speaking events coming up in the near future:

  1. A Look at the Enterprise Library's Data Access Application Block - this talk will be for the San Diego ASP.NET SIG at 6:30 PM on Tuesday, April 19th at the Microsoft SoCal office in UTC. The talk will provide an introduction to the Enterprise Library, focusing on how the Data Access Application Block portion can reduce code and improve data-driven applications.
  2. Working with HTTP Handlers and Modules - on Saturday, May 7th there's an all-day conference being held at the Long Beach Marriott. The conference has four session tracks and boasts an all-star roster of speakers: Rocky Lhotka, Dan Appleman, Michele Leroux Bustamante, Tim Huckaby, and others. My session will be from 2:45-4:00 PM in the Web Development track. The conference runs $79 if you register by April 15th, $99 otherwise; more at http://www.socalnetevents.org/Default.aspx?tabid=55.
Looking Ahead to 2.0...
28 March 05 06:25 PM | Scott Mitchell

With the ASP.NET 2.0 Beta 2 slated to ship this week - this is still the game plan, no? - I'm starting to prepare myself mentally for my next book. I've yet to start it, but I plan on hammering out the TOC within a week after Beta 2 ships and to start writing soon afterward. To be honest, I haven't played around much with 2.0, other than digging into the GridView pretty extensively and working with Generics. But many of the new, cool ASP.NET 2.0 features - profiles, the myriad of new server controls, ControlState, client script callbacks, etc. - I haven't used at all or, if I have, it's been just a cursory examination.

Today I stumbled upon Rick Strahl's latest blog entry, titled What, No Whidbey Version? Rick, who sells a WinForms app, is a bit peeved at people who are moving production code to beta bits and who are likewise on his back for not having a 2.0-compliant version of his application available at this time. What I related to best was Rick's comment on the 2.0 material out in the community - books, conference sessions, magazine articles, etc.:

I’m finding that magazines have little interest in articles on the current version. Same with .NET conferences. Submit 1.x or more ‘general’ topics - even if they are unique - and your chances of getting picked go down pretty drastically.

I have noted the same tendencies. Personally, I think it's a shame. Yes, looking forward is fun and cool and a great way to get a jump start on a new technology, but there are real people doing real work today with 1.x. Who's providing information to them? In fact, I can be counted in that set of people, as the consulting work I do is, and will be for the foreseeable future, rooted heavily in 1.x. Sure, it's cool that there are a ton of 2.0 articles out there, but, at this point, MSDN Magazine is worthless to me, since it's about 95% focused on 2.0. (I understand the purpose of the magazine is to be forward-looking, but the challenges I face and the clients who pay me are more concerned about the present than about what will be shipped - hopefully - by the end of this year. I'm looking forward to moving to 2.0 eventually. It's just that my clients want production-ready code written today.)

Overall, I think it is time for 2.0. 2.0 solves many of the headaches that are around today. Furthermore, .NET 1.x is pretty old, all things considered. I wrote my first classic ASP page in January of 1998. I wrote my first ASP.NET page in May 2000. That's roughly 2.5 years of ASP use, during which I moved from ASP 2.0 to ASP 3.0. I've now been using ASP.NET (rather exclusively, thank the gods) for nearly five years. That's almost twice as long as my tenure with classic ASP, and the only version change was from 1.0 to 1.1. Yes, it's time for a new version, time to fix the woes of the current version, but, like Rick, I'd rather not be inundated with 2.0 information until it's closer to shipping. (Correction: it's not the deluge of 2.0 info that is frustrating, it's the replacing of 1.x information with 2.0 information that's vexing.)

I'll leave you with Kirk Allen Scott's two looks back at the last five years of .NET:

I would promote Kirk's #3 worst - tight coupling of Visual Studio to IIS - as the #1 worst, but as an ASP.NET instructor who has to grade projects that are submitted as Web applications I may be slightly biased.

Don't Forget to Run the Install Services Script After Installing the Enterprise Library!
19 March 05 07:03 PM | Scott Mitchell

Since my first article covering the Enterprise Library appeared on 4Guys - An Introduction to the Enterprise Library - one question I have received at least a dozen times is the following:

When running an Enterprise Library example on my computer I get the following exception:

System.Security.SecurityException: Requested registry access is not allowed

Failed to create instances of performance counter '# of Connection Failures/Sec' - The requested Performance Counter is not a custom counter, it has to be initialized as ReadOnly.

Others have reported receiving this exception:

The source was not found, but some or all event logs could not be searched. Inaccessible logs: Security.

Line XXX: ds = db.ExecuteDataSet(dbCommandWrapper)

What the problem boils down to is that the performance counters used by the Enterprise Library have not been installed. As discussed in Tom Hollander's blog entry Instrumentation in Enterprise Library:

One of the primary goals of Enterprise Library is to showcase best practices for enterprise .NET development. We try to do this in multiple domains, including architecture, code standards, unit testing and operations. ... So instrumentation is very important, which is why we made sure we included it in the blocks - including event log messages, WMI events and performance counters. ... Instrumentation is enabled by default, but to make sure everything is registered you'll need to run the Install Services script from the Start Menu, or run installutil over each assembly (possibly as a part of your own MSIs). When you install Enterprise Library with default settings, all of the code will be automatically compiled, but unfortunately we didn't run the Install Services script for you. This was an unfortunate outcome - but if you remember to run the script yourself then everything should work well.
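
For reference, the installutil route Tom mentions looks something like the following. This is a hypothetical example - both paths assume the default install locations for .NET 1.1 and the January 2005 Enterprise Library release, so adjust them to match your machine - and you'd repeat the command for each block assembly your application uses:

"C:\WINDOWS\Microsoft.NET\Framework\v1.1.4322\installutil.exe" "C:\Program Files\Microsoft Enterprise Library\bin\Microsoft.Practices.EnterpriseLibrary.Data.dll"

The Install Services shortcut on the Start Menu accomplishes the same thing for all of the blocks in one go.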

Essentially the Data Access Application Block (DAAB) portion of the Enterprise Library attempts to log instrumentation information to performance counters. However, these performance counters must be installed manually. If they are not, one of the above exceptions will occur because the EntLib can't find the performance counter it wants to write to.
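
To make the failure point concrete, here's a minimal sketch of the kind of DAAB code that trips the exceptions above. The class and method names come from the January 2005 DAAB; the GetCustomers stored procedure name is made up for illustration:

Imports System.Data
Imports Microsoft.Practices.EnterpriseLibrary.Data

' Create the default database defined in dataConfiguration.config
Dim db As Database = DatabaseFactory.CreateDatabase()
Dim dbCommandWrapper As DBCommandWrapper = db.GetStoredProcCommandWrapper("GetCustomers")

' ExecuteDataSet tries to update the EntLib performance counters behind the
' scenes - if the counters were never installed, this is the line that blows up
Dim ds As DataSet = db.ExecuteDataSet(dbCommandWrapper)

Note that nothing in the calling code references the counters directly, which is why the exception seems to come out of nowhere.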

Hopefully Google picks up this blog entry so folks searching on the exception can quickly find a resolution. I've also updated my two articles on 4Guys that cover the Enterprise Library - An Introduction to the Enterprise Library and Working with the Enterprise Library's Data Access Application Block (DAAB) - so that the text that discusses installing the services is red and bold. Hopefully that will catch people's eyes!

One last comment about the Enterprise Library - there are some great webcasts and PowerPoint slides on the Enterprise Library over at http://www.pnplive.com/ - definitely worth checking out!

Quick Spam Fact
17 March 05 01:43 PM | Scott Mitchell

Just cleaned out my Junk Mail folder (I use the SpamBayes Outlook plugin), which I hadn't done since February 17th. There were 5,558 pieces of spam that SpamBayes had caught without pestering me. How kind. That means that, on average, I was sent nearly 200 pieces of junk mail each and every day (5,558 pieces over 28 days works out to about 198 per day). Bleh.

Part of the reason I am making this blog entry is to have a record of how much spam I received between Feb. 17th and March 17th, 2005. I want to be able to look back and see how my spam intake is faring in future months/years vs. this past month. This is the poor man's way of recording spam stats. See Raymond Chen for a more detailed look at his personal history of spam and virus mail.

Crystal Reports Tip for the Day: Specifying the Database Name Dynamically
09 March 05 07:09 PM | Scott Mitchell

I'm back to doing Crystal Reports for one of my long-term consulting projects... ick. For the project I work on there are two databases: a test database and the live database, the obvious difference being that the test database holds test data while the live database is currently in use with live data. Both have identical schemas.

Anywho, when creating a Crystal Report through VS.NET you have to specify the datasource that you want to use for the report, which can be done through an ADO.NET DataSet or, as is commonly the case in this particular project, through a view, table, or stored procedure on the actual database. Since I am developing against the test database, I've always connected to the test database when crafting my Crystal Reports. In my ASP.NET page I programmatically display my Crystal Report by first setting the logon information using the code Eric Landes shares in his article titled Automagically Display Crystal Parameters. Specifically, in the ApplyInfo() method I set the DatabaseName and other related server properties to the current database I am using, which is spelled out in the application's Web.config file. On the development server the Web.config file indicates to use the test database; on the production server, it says to use the live database.

My assumption was that setting the logon info in this manner would cause the correct datasource to be used by the Crystal Report when viewed on an ASP.NET page. That is, if I created the report referencing the test database, but the report was on the live site, it would use the live site's underlying data. Today, when pushing up my latest batch of reports to the live server, I found out that that assumption was incorrect. Meh. The result: the reports on the live site were showing data from the test database. All of a sudden end users were seeing information about people named Mr. Test and Mr. Test 2. Eep.

My workaround was to augment the ApplyInfo() method, adding the following line in the For Each oCRTable In oCRTables loop:

oCRTable.Location = oCRConnectionInfo.DatabaseName & ".dbo." & oCRTable.Location.Substring(oCRTable.Location.LastIndexOf(".") + 1)

This modifies the datasource used from something like testDB.dbo.TableName to realDB.dbo.TableName, where realDB is the database name specified in the Web.config file (so it's using the test database for its datasource when working on the development server and the live database when viewing a report on the live site).
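
For those who want to see it in context, here's roughly what the augmented loop ends up looking like - a sketch pieced together from Eric Landes' ApplyInfo() pattern, where oCRTables is the report document's Database.Tables collection and oCRConnectionInfo is the ConnectionInfo object populated from Web.config:

Dim oCRTable As CrystalDecisions.CrystalReports.Engine.Table
For Each oCRTable In oCRTables
    ' Apply the server/database/credential settings read from Web.config
    Dim oCRTableLogonInfo As CrystalDecisions.Shared.TableLogOnInfo = oCRTable.LogOnInfo
    oCRTableLogonInfo.ConnectionInfo = oCRConnectionInfo
    oCRTable.ApplyLogOnInfo(oCRTableLogonInfo)

    ' The new line: re-point the table from the database the report was
    ' designed against to the one named in Web.config
    oCRTable.Location = oCRConnectionInfo.DatabaseName & ".dbo." & _
        oCRTable.Location.Substring(oCRTable.Location.LastIndexOf(".") + 1)
Next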

Hope this saves someone else the headache I experienced! :-)

More Enterprise Library Goodness... and GDS Moves Out of Beta!
08 March 05 10:14 PM | Scott Mitchell

Over the weekend I've been tinkering around with Microsoft's recently released Enterprise Library and wrote an article focusing on the Data Access Application Block in the Enterprise Library for this week's 4Guys article: Working with the Enterprise Library's Data Access Application Block. This is my second article on the Enterprise Library, the first being a general introduction piece: An Introduction to the Microsoft Enterprise Library. (For those in San Diego who are interested in the Enterprise Library, I'll be giving a free user group talk on the Enterprise Library and DAAB at the San Diego ASP.NET SIG on Tuesday, April 19th.)

I've yet to have a chance to use the Enterprise Library in a real-world project - all my current projects use the DAAB version 2.0 - but I'm itching to make the move, if for no other reason than to get the instrumentation features built into the Enterprise Library.


As an aside, Google has moved its Google Desktop Search (GDS) out of beta. I gave GDS a trial run back in its beta days but stuck with Lookout because I wanted to be able to index source code files as well as non-AIM chat logs. The latest version of GDS includes a plugin framework, and there is a gaggle of plugins available, including ones for searching text files (i.e., source code) and one for Trillian Pro (my IM client). (There are also plugins for searching .chm files and OpenOffice and StarOffice documents.) In addition to these plugins, the new GDS version also searches Firefox history, PDF docs, music files, images, and video files. Not bad. All of this plus Google's world-class ease of use and lightning-fast search speeds.

What really sold me on the new GDS version was the ability to search by file extension. Say I want to look at how I implemented a custom ASP.NET server control that implements IStateManager to customize view state loading and saving. The following search query fits the bill:

filetype:cs +IStateManager +LoadViewState +SaveViewState

And - bang - there are all the C# source files I was interested in. Viva la Google! :-)

Comment Spam Script Gone Awry
06 March 05 09:58 PM | Scott Mitchell

As I blogged about earlier, I've altered the data model of .Text (the blog software that currently runs ScottOnWriting.NET) to include a trigger that, when a new item is added to the blog_Content table, checks to see whether it contains more than 20 hyperlinks or a link to one of the URLs listed in a blog_BannedURLs table. If such a nefarious comment is found, not only is it not saved, but a counter in another table is incremented to let me ascertain just how much comment spam has been stopped. Since January 24 of this year my trigger technique has caught 2,292 comment spams. Amazing and depressing at the same time.
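
For the curious, the heart of the trigger amounts to something along these lines. This is a simplified T-SQL sketch rather than the actual trigger - the blog_SpamStats table and all column names (everything, really, other than the blog_Content and blog_BannedURLs table names) are made up for illustration, and it assumes the comment body lives in an nvarchar column here called Body:

CREATE TRIGGER trCatchCommentSpam ON blog_Content
FOR INSERT
AS
    DECLARE @spamCount int

    -- Count inserted rows containing more than 20 links or a banned URL;
    -- the LEN/REPLACE trick counts occurrences of 'http://' (7 characters)
    SELECT @spamCount = COUNT(*)
    FROM inserted i
    WHERE (LEN(i.Body) - LEN(REPLACE(i.Body, 'http://', ''))) / 7 > 20
       OR EXISTS (SELECT * FROM blog_BannedURLs b
                  WHERE i.Body LIKE '%' + b.URL + '%')

    IF @spamCount > 0
    BEGIN
        -- Toss the offending insert, then bump the running tally
        -- (statements after the ROLLBACK run in their own transaction)
        ROLLBACK TRANSACTION
        UPDATE blog_SpamStats SET SpamCaught = SpamCaught + @spamCount
    END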

The trigger approach looks for offending URLs in both trackbacks and comments, searching both the body of the comment as well as the poster's specified URL. One item that my trigger does not check, though, is the comment one might leave when rating a blog entry. There's no reason to check what the user enters into the (optional) comments section when rating a blog entry because the comment appears only in an email that is sent directly to me.

Today I found out that comment spammers don't necessarily check too closely whether or not their spammed content actually appears on the site. I received a little over 50 emails from my comment rater, chock-full of links to sundry adult sites purporting to have pictures of Ashley and Mary-Kate. (Why someone would want to look at those anorexics is beyond me...)


Been meaning to blog more as of late, but work's been keeping me down. My tentative plans for upcoming tasks relating to this blog include:

  • Porting ScottOnWriting.NET from .Text 0.94 to Community Server, although part of me wants to wait until version 1.2, when Rob promises the API will be frozen. The nice thing about moving to Community Server is that I'll be able to move the blogs for skmMenu and RssFeed from GotDotNet - which appears to be down every other day - to Community Server Forums here on the ScottOnWriting.NET server.
  • Making a blog entry about the webcam software I wrote back in January. Earlier this year I picked up a Logitech webcam and wrote some software that periodically uploads the latest pic to a web server using FTP. The app's been running in the background on my machine without incident since late January and, seeing as the app's built upon a number of open-source projects, I thought it would be nice to share the source with those who are interested.
  • Blogging about ASP.NET 2.0. Prior to the latest work crunch (which started, not coincidentally, on the same date as my last blog entry), I had spent some additional time with the 2.0 bits. With Beta 2 coming out at the end of this month, I thought it would be good to get proactive and start rambling on about v.Next.