August 2005 - Posts

Review of QuickWebSoft.com's ColorPicker Web Control
31 August 05 03:00 PM | Scott Mitchell | with no comments

At the beginning of this month I started work on a project that allowed users to customize the appearance of the site to a great degree. One such customization was the colors the site used: background colors, menu colors, text colors, popup box colors, and so on. Rather than limiting users to picking from a drop-down list of color names, or forcing them to know the hex color combination (which, honestly, only a Web developer would even know about), my client wanted users to be able to select a color via a color picker control.

The Windows world abounds with color picker controls, but, as my research found, there are far fewer ASP.NET-based color pickers. I started by asking a question on the ASP.NET Forums: Recommend a Color Picker ASP.NET Web Control? I got a couple of suggestions, but wasn't too thrilled with either of them. Eventually I found QuickWebSoft.com's ASP.NET ColorPicker control. My client liked the functionality, the cross-browser support (tested on IE, Opera, and Firefox for Windows and Safari for the Mac), and the price ($49 for a single server license), so I ended up using that particular color picker control. (There are some live demos you can try out at http://www.quickwebsoft.com/ColorPicker/ColorPicker.aspx#Demo.)

Overall, the ColorPicker control was easy to set up and start using. It took just a minute or two to copy the requisite resource files to the aspnet_client folder, move the assembly to the /bin directory, and add the control to the VS.NET Toolbox. Once that was accomplished, I could simply drag and drop the control onto the page and determine the value entered by the user through the control's ColorName, ColorValue, and/or Color properties. Couldn't be easier.
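
For example, reading the user's selection back out in a postback event handler is a one-liner per property. (This is a hypothetical sketch rather than code from the product's documentation: the control instance name, clrBackground, is made up, and I'm assuming ColorName/ColorValue return strings and Color returns a System.Drawing.Color.)

// Hypothetical sketch - the instance name and property types are assumptions.
private void SaveButton_Click(object sender, EventArgs e)
{
    string hexValue = clrBackground.ColorValue;      // e.g. "#AA7601"
    string friendlyName = clrBackground.ColorName;   // friendly color name, if one exists
    System.Drawing.Color chosen = clrBackground.Color;

    // ... persist the user's choice to their customization settings ...
}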

There were a couple of downsides to the control, however, that were quickly apparent. One issue was that the Enabled property didn't work; that is, even when the control's Enabled property was set to False, the user could still pick a color. Another issue was that each ColorPicker instance emitted the client-side JavaScript it needed, even though this script was identical for all ColorPickers on the page that used the same palette data. This script was around 10 KB. On one page I had 12 color pickers, which led to over 120 KB of client-side script, of which only about 10 KB was unique!

I emailed QuickWebSoft.com's support about these two issues; they acknowledged the problems and informed me that they would be fixed in a future version. What was highly unexpected, however, was that this next version was sitting in my Inbox the next morning. I must have sparked an all-night coding session with my requests! :-) (One thing to be a tad wary of: you can tell that this is, likely, a one-man shop. Support emails usually take about 12 hours to get a response, and there are no online support forums where you can ask questions or where answers to your questions may already reside...)

The ColorPicker works well for simple use cases, but in more involved scenarios I had to exert quite a bit of effort to get it to 'play nice.' For example, the ColorPicker has a server-side event, ColorPicked, that fires when a color is selected. However, if you want some client-side action to take place when a color is picked, you're pretty much on your own. Sure, there's a property you can set that will run any JavaScript you assign to it, but you still have to write the client-side script that reads or sets the color value, which, while not terribly complex, still took a good half hour to hammer out correctly. Another gripe is that the palette files the ColorPicker ships with, or that can be downloaded from the website, contain only the 216 web-safe colors (or fewer!). For this application I wanted to allow users to pick from a wider array of colors, much like the standard MS Office color picker, where there's a fine gradient. Sure, you can create your own palette files, but this project's time and budget did not account for that. One last gripe: there isn't an option to include a textbox into which a user can simply type a color. For example, if a user knows that their logo's background color is, say, #AA7601, they may want to just type that in rather than having to peck through the color picker (or, worse yet, discover that the color isn't available in the ColorPicker's palette). The main challenge in adding a textbox was wiring up all the client-side events to ensure that after entering a value in the textbox the color would be displayed properly, and that when picking a color from the color picker the color value would be inserted automatically into the textbox.

Overall, I do recommend QuickWebSoft.com's ColorPicker if you need a simple color picker control at a very affordable price. If you need a great deal of customization, or need to tinker with the control from client-side script, you might want to spend sufficient time trying out this and other options before settling on one.

Upcoming One-Day Lecture on .NET 2.0 (San Diego, CA)
30 August 05 10:12 AM | Scott Mitchell | with no comments

On Saturday, September 10th, University of California - San Diego Extension will be putting on a one-day lecture serving as an introduction to .NET version 2.0! We'll be discussing what's new in version 2.0 along with tips for migrating your existing applications. The event will be held in the Salk building on UC-San Diego's beautiful campus.

The day of lectures includes four talks and a speaker's panel / wrap-up discussion. The speakers at this event include:

These talks cover smart clients, ASP.NET 2.0, the .NET Framework version 2.0, and language migration. The day-long lecture runs from 9:00 AM to 3:00 PM with lunch included. All this for a scant $95!!

If you're interested in learning more about this event or are interested in enrolling, visit http://extension.ucsd.edu/studyarea/index.cfm?vCourse=CSE-40940. Hope to see you there!

FeedBurner and Changing a Blog's Feed URL
28 August 05 05:23 PM | Scott Mitchell | with no comments

This weekend I moved my RSS feed - previously http://ScottOnWriting.NET/sowBlog/Rss.aspx - over to a feed managed by FeedBurner (http://feeds.feedburner.com/ScottOnWriting). FeedBurner serves as a sort of feed URL proxy: you give FeedBurner a link to your RSS feed and it creates a new feed based on it. You then point your subscribers to the FeedBurner feed, and FeedBurner serves up your site's content, maintains statistics on who's subscribing to your blog, and so on.

I decided to move to FeedBurner to realize three benefits (keep in mind that ScottOnWriting.NET (still) runs off of an old version of Scott Watermasysk's .Text blogging engine, as I've yet to upgrade to Community Server; previous to today, I was actually using a pre-0.94 version, but today "upgraded" to the official 0.94 release downloadable from the .Text GotDotNet Workspace):

  1. Subscription statistics - FeedBurner provides a number of free statistics, including number of subscribers, number of requests, and aggregator breakdown.
  2. Someone else handles the bandwidth - currently requests to the RSS feed on ScottOnWriting.NET consume roughly 1.5 GB of traffic per week, or 6 GB of traffic per month (in total, ScottOnWriting does about 11 GB of traffic per month). That's a lot of 1s and 0s that would be nice to offload to another party. (I don't believe the pre-0.94 version of .Text I was using supported conditional HTTP GETs, although if I'm not mistaken the "official" 0.94 release does; had I been using a version that supported conditional GETs, this bandwidth requirement would be an order of magnitude lower, I'd wager - perhaps just a GB for the month.) (To clarify, while FeedBurner does make requests to the blog's RSS URL, it caches the results for a period of time, thereby reducing the bandwidth demands on my server.)
  3. FeedBurner has a couple of neat "publicizing" tools - FeedBurner includes a number of tools to easily create links that add your blog to My Yahoo!, My MSN, NewsGator Online, and so on. Additionally, there are nifty little tools you can use to "show off" how many folks subscribe to your blog.

When changing your RSS feed URL, the main challenge is making sure that your existing subscriber base starts to use the new feed URL. There are, to my knowledge, two ways this can be done, the first of which is the ideal approach:

  1. Have your old feed URL emit an HTTP 301 status code - The HTTP 301 status code is a message from the server to the client saying, "Hey, this resource has been permanently moved to URL xyz." The client can then make a new request to the specified URL; additionally, if there's some database being used to track the URL, this message informs the client that it's time to update the database and use the new location. If I'm not mistaken, virtually all modern aggregators support HTTP 301 status codes and will automatically update a site's feed URL to use the newly specified location. (A sample 301 response is shown after this list.)
  2. Tell people of your new feed URL - if you do not have control over your blog website you may not be able to take the steps needed to replace the current feed URL with an HTTP 301 status code. In this case, the only approach I know of to inform users of the new feed URL is simply through word of mouth. That is, you'll just have to post on your blog an entry telling users to update their aggregators. As Kent Sharkey has noted, though, the results may be somewhat disappointing.
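
For reference, here's roughly what such a 301 response looks like on the wire; the Location header carries the new feed URL:

HTTP/1.1 301 Moved Permanently
Location: http://feeds.feedburner.com/ScottOnWriting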

Since I run ScottOnWriting.NET myself (well, through a web hosting company), I have control over these matters. The only challenge, then, was getting .Text to play nice. In .Text version 0.94 the site's RSS feed comes from a file named Rss.aspx. This file, though, does not actually exist; rather, in the Web.config file all requests are handed off to a .Text HTTP Handler. When a request comes in for Rss.aspx, .Text generates the appropriate output.

To get Rss.aspx replaced with an HTTP 301 status code, the first step is to create an Rss.aspx file in your blog's root directory. The code needed for this page is alarmingly simple - all you want to do is return an HTTP 301 specifying the new feed URL, like so:

<script runat="server" language="C#">
  void Page_Load(object sender, EventArgs e)
  {
    // Issue a permanent redirect (301) pointing at the new feed URL.
    Response.Status = "301 Moved Permanently";
    Response.AddHeader("Location", "http://feeds.feedburner.com/ScottOnWriting");
  }
</script>

(Of course, replace the http://feeds.feedburner.com/ScottOnWriting Location header value with the URL of your new RSS feed...)

Creating this file is not enough. In fact, even after creating this file if you visit Rss.aspx through your browser you'll still see the complete RSS feed rather than being auto-redirected to the specified URL. This is because the ASP.NET engine is handing off the request to the .Text HTTP Handler rather than handling the request itself. If you look at the <httpHandlers> section in the Web.config file you'll find an entry like:

<add verb="*" path="*.aspx" type="Dottext.Framework.UrlManager.UrlReWriteHandlerFactory,Dottext.Framework" />

This entry says, "Any request for an ASP.NET page should be handled by the Dottext.Framework.UrlManager.UrlReWriteHandlerFactory class in the Dottext.Framework assembly," an HTTP Handler. This includes requests for Rss.aspx. Hence we need to add the following line to the <httpHandlers> section:

<add verb="*" path="Rss.aspx" type="System.Web.UI.PageHandlerFactory" />

That tells the ASP.NET engine to take care of requests to Rss.aspx itself. At first I naively thought that I was done, but I had just unwittingly set up an infinite loop! When a request comes into Rss.aspx, it sends back a 301 status code to the client, saying, "No, no, no, you want to go to this FeedBurner URL." This is what we want to tell people coming through a browser or aggregator, but remember that FeedBurner also needs to know the URL of the site's feed, which, at this point, I had set simply as Rss.aspx! So when FeedBurner periodically checked to see if a new version of my feed was available, it requested Rss.aspx, which told it to check itself, which says to check Rss.aspx, which says to check itself, which... you get the point.

Instead, what I needed to do was rename .Text's Rss.aspx to something else, like RssFromText.aspx and instruct FeedBurner to use this alternate, “secretive” feed URL. With this setup, a user who already subscribes to ScottOnWriting.NET through Rss.aspx will automatically be switched over to FeedBurner. FeedBurner's RSS content will be populated from RssFromText.aspx, which will be generated from .Text, reflecting the most recent blog entries. No more infinite loops!

To accomplish this I had to edit blog.config to tell .Text that it should use RssFromText.aspx as its RSS feed URL as opposed to Rss.aspx. This involved updating the appropriate <HttpHandler> line like so:

<HttpHandler Pattern = "(?:\/RssFromText.aspx)$" Type = "Dottext.Framework.Syndication.RssHandler, Dottext.Framework" HandlerType = "Direct" />

With this addition, requests to Rss.aspx are now sent back an HTTP 301 status code, but FeedBurner can still slurp down the site's content through RssFromText.aspx.

Next, you'll probably want to update the link to the site's feed in the My Links section (since this will point users to Rss.aspx, but you want them to go directly to the FeedBurner link). (In actuality, this step is probably optional, since even if you do leave it as Rss.aspx, when they attempt to view that page through a browser or slurp it through an aggregator, they'll get the 301 status code and auto-redirect to the FeedBurner URL... but still, for completeness, let's change this link.) To accomplish this, simply edit the MyLinks.ascx file in the ~/Skins/skin_name/Controls/ directory. With version 0.94 you'll find two HyperLink controls that .Text automatically looks for and whose NavigateUrl properties it sets to Rss.aspx - these controls have the IDs Syndication and XMLLink. Even if you explicitly set the NavigateUrl properties to your FeedBurner URL, .Text will overwrite them and the links will be rendered as Rss.aspx. If you try to simply remove these HyperLink controls you'll find that (at least with 0.94) .Text will barf. What I did was simply set their Visible property to False. I then added two HyperLink Web controls of my own that referenced the new feed URL.
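
Here's roughly what the relevant portion of MyLinks.ascx ended up looking like. (This is a sketch from memory rather than the skin's exact markup - the Syndication and XMLLink IDs come from the stock 0.94 skin, while the ID and Text of the replacement link are my own; the second replacement link is analogous.)

<%-- Hide the stock links that .Text insists on pointing at Rss.aspx... --%>
<asp:HyperLink id="Syndication" runat="server" Visible="False" />
<asp:HyperLink id="XMLLink" runat="server" Visible="False" />

<%-- ...and add a replacement that points at the FeedBurner feed. --%>
<asp:HyperLink id="FeedBurnerLink" runat="server"
    NavigateUrl="http://feeds.feedburner.com/ScottOnWriting"
    Text="Syndicate this site (RSS)" />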

There's one more facet that should be changed, although I've not made the change since (to my understanding) you'd need to actually hack the .Text source code, recompile, and re-deploy. In the <head> portion of the web pages in your blog you'll find a tag that's used for RSS feed auto-discovery:

<link rel="alternate" href="http://scottonwriting.net/sowblog/rss.aspx" type="application/rss+xml" title="RSS" >

Ideally the href attribute would contain the URL of your new feed... but I didn't feel like going through the headache of pecking through the source, making a change, testing, and so forth. So I just left it as-is, figuring in the worst case someone will “discover” my feed to be Rss.aspx, which will automatically be updated to the FeedBurner syndication URL as soon as their aggregator makes its first request to Rss.aspx.
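
For the record, had I made the change, the tag would simply swap in the FeedBurner URL:

<link rel="alternate" href="http://feeds.feedburner.com/ScottOnWriting" type="application/rss+xml" title="RSS" >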

The FeedBurner service looks pretty cool at first glance. Once this new feed URL gets some use and I get some metrics in FeedBurner's database, I plan on sharing some of the stats... it'll be interesting to see what the most popular aggregator is among ASP.NET developers (the primary audience of this blog, I imagine), among other data points.

Technology and Poverty
26 August 05 12:00 AM | Scott Mitchell | with no comments

SoCal developer Rob Walling recently returned from a three week trip to Ghana, West Africa, where he spent his time teaching the people of Ghana how to build websites. He's got a great piece on his site, Using Technology to Fight Poverty, which gives some details about his trip and some insight into the hopes of many of the developers Rob met and the great poverty that affects Ghana and many other nations like it:

One of the employees at the IT facility where I led website training asked how he could learn ASP and PHP (a couple of common web programming languages). He wants to learn web development so he can move to the UK in search of a better life. Since there is very little computer work in Ghana, someone who acquires technical skills quickly leaves the country in search of opportunity. Can you say brain drain?

In any case, since classroom learning is expensive and since I've always been more of a "teach yourself" type of person, I recommended two books to get him started. He looked at me sheepishly and told me that technical books are really expensive. This struck me as odd because I had talked to a guy from the UK just two days earlier and he mentioned a bookstore with technical books at around 70% off cover price, so a $50 book was only $15. Feeling well-informed I began to tell him, but the instant the words came out of my mouth I noticed a look on his face and realized that this is an outrageous amount of money for him. Luckily, I quickly righted the ship, adding "...but I realize that is still very expensive."

15 bucks. The guy works 40 hours a week at an IT training facility and can't afford a $15 computer book. He's not starving. He's not living in a mud hut on the side of the road scraping to feed his family. But $15 is probably a week's salary for him, maybe more. At 83 times the minimum wage this book would cost $427 in the U.S., and the book was actually an old edition (from 2001), which as most of us know is almost worthless in the world of computer programming. If he wanted a current edition he would have to pay three times that if he could find it at all.

Does this seem wrong to anyone else?

Rob's post goes on to cite numerous facts about the troubles facing Ghana and other impoverished nations, along with why the average Westerner should care. He also includes a variety of ways you can help. Rob's main thesis is that with globalism and the power of the Internet, one ought to utilize technology to help improve the plight of the Ghanaian citizen:

One possible approach for helping the lower class, who is able to survive day to day but is in dire need of a higher standard of living, would be to take advantage of the global economy through e-commerce.

By setting up an online store through eBay or Yahoo!, a Ghanaian drum-maker could dramatically increase his market of potential buyers while increasing his profit margins. This idea, though not using the internet, has been executed with great success in Central America where Westerners have set up co-ops where craftspeople create products and elect one person to handle the business aspects. Modifying the idea for the internet, one would find a local with computer skills (the coordinator) and put him in charge of maintaining the online store. The remaining craftspeople would be notified when they needed to ship an item via an email, phone call, or a knock on their door. A small cut from each item would go to cover overhead: the cost of visits to an internet cafe, plus the coordinator's wages.

One real-life example: a 12" Djembe drum that sells for $25 in the local marketplace in Ghana runs $90 on eBay. The additional profit on the sale of one drum would cover the cost of internet access, eBay fees, and a large portion, if not all, of the coordinator's salary.

It's an interesting idea, although I wonder how much of that profit margin will see its way to the actual drum makers. (Call me cynical.)

If you're interested in reading more about Rob's travels abroad, check out this entry: Back from Africa: My Glimpse of the Digital Divide.

Enhancements to skmLinkButton
24 August 05 12:45 AM | Scott Mitchell | with no comments

In a previous blog entry I talked about skmLinkButton, a simple, open-source enhancement to the built-in ASP.NET LinkButton Web control. skmLinkButton includes properties to display text in the browser's status bar when mousing over the link, as well as properties to easily add a client-side confirm messagebox when clicking the link.

I recently received an email inquiry in my Inbox from loyal 4Guys reader David P., who asked:

I'm writing to you because I think you may be able to help me. I enjoyed your article about overriding Linkbutton very much, but I have a further need that you may have a solution to. A LinkButton is rendered like <a href="javascript:__doPostBack(.....)" onClick="alert('pressed')">Click me</a>.

My problem is that if I SHIFT+CLICK the LinkButton [or right-click on it and opt to open in a new window], a new window opens [with the JavaScript, javascript:__doPostBack(.....), in the Address bar, resulting in a client-script error and a confused user.] ... I want to avoid that. I've found a solution:

<a href='#' onClick="__doPostBack(....);">Click me</a>

My problem is that I don't know which attribute to modify so that the Href value is moved to the onClick event... It must be done in an inherited control or codebehind.

(To see an example of David's problem, click here. The hyperlink's markup is simply <a href="javascript:var x = 4;" target="_blank">click here</a>. Note that clicking the link opens a new window that displays the JavaScript in the address bar and returns an error. This mimics the behavior of an ASP.NET LinkButton that is clicked to have opened in a new window.)

I tinkered with skmLinkButton to do what David requested. The end result can be read about in this week's 4GuysFromRolla.com article, Stopping JavaScript Errors When Opening a LinkButton in a New Window. There are live demos to tinker around with over at http://scottonwriting.net/demos/skmLinkButtonDemo.aspx. You can download the complete source code, along with an example ASP.NET page and a pre-compiled assembly, at http://aspnet.4guysfromrolla.com/code/skmLinkButton.zip.
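
If you're curious about the general idea before digging into the article, here's a rough sketch: render the anchor's href as "#" and move the postback script into the onclick attribute. (This is a simplified illustration of the technique, not skmLinkButton's actual source; the class name is made up, and it ignores details like the Enabled and CssClass properties.)

using System.Web.UI;
using System.Web.UI.WebControls;

// Hypothetical, simplified control - not skmLinkButton's real implementation.
public class NewWindowSafeLinkButton : LinkButton
{
    protected override void Render(HtmlTextWriter writer)
    {
        // href="#" keeps SHIFT+CLICK / "Open in New Window" from showing
        // javascript:__doPostBack(...) in the new window's Address bar...
        writer.AddAttribute(HtmlTextWriterAttribute.Href, "#");

        // ...while the postback script moves to the onclick event.
        writer.AddAttribute("onclick",
            Page.GetPostBackEventReference(this) + "; return false;");

        writer.RenderBeginTag(HtmlTextWriterTag.A);
        writer.Write(Text);
        writer.RenderEndTag();
    }
}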

Requiring HTTPS for Certain ASP.NET Pages
17 August 05 12:38 AM | Scott Mitchell | with no comments

I'm currently working on a project that has SSL capabilities. Users reach the site through a partner site, and a user's session uses SSL only if the partner site also uses secure communications. There's also an administrative section that, regardless of whether the user entered over SSL, must be accessed via SSL. How, then, do you ensure that when a user visits a particular page or subset of pages, they do so over a secure channel?

There are a couple of techniques that I'm aware of. Probably the most sound way is to make the setting through IIS. In the IIS management console you can right-click on the folder or website you want to require be accessed only through SSL and go to Properties. Next, tab over to the Directory Security or File Security tab (depending on whether you're configuring a directory or a file) and, in the Secure Communications section, click the Edit button. This will bring up the Secure Communications dialog box; from there you can check the "Require secure channel" checkbox and, voila, the directory or file can now only be visited through HTTPS.

Once you have made this setting, if a client attempts to visit such a configured resource through HTTP (rather than HTTPS), they will receive an HTTP Error 403.4 - Forbidden: SSL is required to view this resource.

You can also require SSL programmatically through your ASP.NET pages' code-behind classes, which can be useful if you don't have direct access to the web server to make the settings described above. (Realize that setting the "Require secure channel" option through IIS has the advantage that it requires a secure channel for all types of resources - ASP.NET pages, HTML pages, images, and so on. Making this setting through ASP.NET - either in a code-behind class or in an HTTP Module - will only require SSL for those requests that IIS hands off to the ASP.NET worker process.)

With the ASP.NET code-behind technique you basically add a bit of code to your code-behind class (or, better yet, a base class or HTTP Module) that checks to see if the request is through a secure channel; if not, it redirects the user to the same URL but through HTTPS. An example can be seen in this blog entry: 443 <--> 80 - Seamlessly moving requests in and out of SSL.

In my project I ended up using both techniques, actually. For the ASP.NET pages in the administrative interface I used a technique very similar to the one described in that blog entry. The main difference was that I added a check to see if the incoming request was coming through localhost. If it was, I didn't sweat the HTTP --> HTTPS translation and just stayed with HTTP. (I did this because locally I do not have an SSL cert; sure, I could easily create and set one up, but why bother?) I went with the programmatic approach because the user might already 'legally' be on the site through HTTP and then click a link to go to the admin page. I guess I could have gone back and ensured that all admin links were fully qualified with URLs starting with https://, but instead I opted to let the person hit the admin page through HTTP only to be auto-redirected to the same page through HTTPS. I used the IIS approach in a couple of places where I had .htm files that needed to be protected. Also, I am using ELMAH on the site and wanted to ensure that its error log viewing page (elmah/default.aspx) could only be viewed through SSL, so potentially sensitive information couldn't 'accidentally' be sent over an insecure channel.
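
Here's a minimal sketch of that check as it might appear in a page's (or a base page class's) Page_Load; my actual code differs a bit, but this illustrates the idea using the standard Request properties:

private void Page_Load(object sender, EventArgs e)
{
    // If the request came in over HTTP and isn't from localhost,
    // bounce the visitor to the same URL over HTTPS.
    if (!Request.IsSecureConnection && !Request.Url.IsLoopback)
    {
        string secureUrl = "https://" + Request.Url.Host + Request.RawUrl;
        Response.Redirect(secureUrl, true);
    }
}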

In closing, here are some additional resources I found on this topic that are worth reading:

DVXP's TEdit.NET Control
10 August 05 12:31 AM | Scott Mitchell | with no comments

My last blog entry provided a list of commercial, third-party ASP.NET server controls I have used in past projects, along with a mini-review of each. Today I added another notch to the proverbial commercial ASP.NET server control headboard: DVXP's TEdit.NET.

TEdit, which stands for Table Edit, is a Web control designed to make viewing/editing/inserting/deleting data from a database table as easy as point-and-click. And it does a pretty good job at that. In past projects I've always extended the DataGrid in some manner to show, edit, delete, etc. data from a database table. TEdit.NET makes this job much easier, but does introduce a bit of a learning curve for those who have always used the DataGrid.

First, rather than specifying the “structure” of the grid in the ASP.NET page's declarative syntax, a separate XML configuration file is used. Furthermore, with TEdit.NET you need to provide either a table name, view, or stored procedure that has the data to bind to the grid. If, like me, you've spent the time to build up a rich middle tier with an assortment of business objects and classes that provide the data to the presentation tier, you're SOL, as far as I can tell - with TEdit.NET there's a tight coupling to the data model.

While this is an annoyance and a potential maintenance issue for large ASP.NET applications, I think TEdit.NET is a killer app for small applications. Once you get it set up, you get, without writing a lick of code, the ability to:

  • Sort data
  • Search on data
  • Page through data
  • Edit data
  • Insert data
  • Delete data
  • Cache data

Granted, the GridView and DataSource controls in ASP.NET 2.0 provide a lot of this out of the box, but with TEdit.NET I've found it amazingly easy to make an impressively nice-looking and very intuitive table editor with minimal effort.

TEdit.NET's usage and syntax is a bit different from the DataGrid's, so, as I mentioned earlier, there is a bit of a learning curve for those of us who have always used DataGrids to provide access to backend data. For example, there are different events and different “patterns” used for formatting the data in the grid, iterating through the records of the grid, accessing a particular record's column values, and so on. If you are familiar with DataTables, though, you'll likely find this learning curve to be quite flat, as TEdit.NET basically provides programmatic access to the DataTable that serves as the foundation of TEdit.NET's data.

If you have a smallish ASP.NET application where you need to create professional-looking interfaces for editing backend data, I'd highly recommend TEdit.NET. At $249.00 for a single server license, I think you'll find this control pays for itself in time saved within the first week.

Third-Party, Commercial ASP.NET Components I've Used
04 August 05 11:49 AM | Scott Mitchell | with no comments

Over a number of ASP.NET projects I've done for companies I've used a variety of third-party, commercial ASP.NET components to accomplish a bevy of common tasks. Oftentimes it makes sense for a client to plunk down the money for a pre-built component that I can plug into his application rather than paying me to create the needed functionality from scratch. I thought I'd share the list of third-party components I've used in the past, along with a (very) short review of each. I'd also be interested to hear what third-party ASP.NET components you've used in the past, and what you thought of them. (Of course, for a more lengthy list of third-party ASP.NET components - some commercial, some free - check out the ASP.NET Control Gallery.)

  • r.a.d. menu - despite having created my own free, open-source ASP.NET menu component (skmMenu), I have used r.a.d. menu in a number of applications. (The reasons for using this over skmMenu have varied. In some cases the client had already purchased r.a.d. menu; r.a.d. menu also offers better cross-browser support than skmMenu and can be customized more easily to provide a more professional look. There have been projects, though, in which I have used skmMenu, so don't let my use of 'competitor' menu components deter you from checking out skmMenu!) I like r.a.d. menu, as it's always been easy to set up and get working with. I've used it in situations both involving rather static, boring menus and ones where the entire menu structure was dynamically generated based on various parameters (the logged-in user, the current data being worked on, the state of the system, and so on).
  • r.a.d. spell - while I've really liked r.a.d. menu, unfortunately I've not been nearly as impressed with r.a.d. spell. r.a.d. spell provides a client-side spell checker, and while it's relatively easy to set up and looks slick, I've had many random problems reported from various users on sites where I've used r.a.d. spell. Complaints have ranged from dictionary suggestions that seem 'off' to cryptic, client-side error messages when attempting to spell check.
  • Peter's Date Package and Professional Validation and More (VAM) - I'm a big fan of both of these products from Peter Blum. In one project I use the DateTextBox, CurrencyTextBox, DecimalTextBox, and IntegerTextBox like nobody's business. The end users love them, as they use client-side JavaScript to restrict the data being entered and have various bells and whistles (such as the nice-looking drop-down calendar in the DateTextBox, the little up and down arrows to increment/decrement the value in the IntegerTextBox, DecimalTextBox, and CurrencyTextBox, and so on). I've yet to explore the true depth of the extra validation controls and capabilities, but expect those features to become more useful as this project evolves. But, seriously, the 'masked' TextBoxes alone made the entire purchase totally worthwhile. (In fact, in an earlier blog entry I carried on about my affection for Peter Blum's controls.)
  • aspNet Email - I had a client who needed to blast out customized emails to thousands of registered users on his site, and he contacted me for advice. I recommended Dave Wanta's aspNet Email component, which is not only easy to setup and blindingly fast in shooting emails out the door, but has a MailMerge() method that made writing the entire application about a thirty minute endeavor.
  • Tall PDF - I've not worked with this component in great enough detail to give it much of a review. I was able to accomplish what I needed to with it - building up a rather simple, two-page report in a PDF document - and didn't have too hard of a time doing it. The major disappointment was that the resolution of the PDF file seemed a bit low - that is, I could not import an image (specifically a graph that needed to appear in the report) unless it was something like less than 460 pixels wide and 620 pixels high. Kind of a bummer, because I had to shrink down the dynamically generated graph in order to get it to fit, thereby losing some of the detail. But other than that annoyance, no complaints.
  • Dundas Chart for .NET - I haven't used Dundas in a client's application, but I was given a free copy from the kind folks at Dundas about half a year ago and have used the component in a couple of my own, non-public web applications. I was impressed by the look and feel and the ease with which I could chart data. One such 'private' application I've used this component in is one that I use to track my weight and caloric intake. Here's a graph showing my weight over the last six months or so. Creating this chart basically involved just a half dozen lines of code, and its appearance can be easily configured through the Visual Studio .NET Designer. Personally, I think it's a pretty snazzy-looking chart considering my artistic skills!

I'm sure there are a handful of controls I'm forgetting, but the above list gives a smattering of the commercial ASP.NET components I'm currently using and have used in the past.

Care to add a comment about one of the commercial controls listed above? Want to add a mini-review of a 3rd-party ASP.NET component you've used? If so, simply add a comment.
