Crazy Egg Webinar (Text Version)

Below is the text version for the Crazy Egg webinar.

Narrator: Hi. I wanted to welcome you to the first in our new series of TAO-sponsored usability training webinars, which are designed to provide practical tools and techniques for improving the user experience on our Web sites.

And we're kicking off this webinar series in celebration of World Usability Day, which was last Thursday. World Usability Day, for those of you who aren't familiar with it, is an annual event highlighting the importance of good design and ease of use in everyday products and services. This year's theme is "Communication."

For this first webinar, we're going to be talking about how we can use an online tool called Crazy Egg to learn more about how our users are interacting with our Web sites. This is a really great tool, and I am very excited to be sharing it with you today because I know there's been a lot of interest in it.

The screen transitions to an agenda slide with the following text:

  • Introductions
  • What is Crazy Egg?
  • Case Study: EERE home page
  • Crazy Egg strengths and constraints
  • Demo of tool

Before we start, a few housekeeping notes—I am recording this session so that anyone who can't be here today can watch it at a later time. Also, we have quite a number of folks on the phone. So I'm going to keep everyone on mute, but if you have a question, please raise your hand by clicking on this little yellow hand icon (the mouse points to the hand-and-arrow icon on the screen) in your GoTo webinar control panel, and I will unmute you during the question period. We're going to do this instead of using the question box.

During our session today, we're going to start with an introduction, then we're going to take a look at a case study using the EERE home page. We're going to talk about what Crazy Egg is and what makes it special, and at the end we're going to take a tour of the tool.

The screen transitions to a slide showing a head-and-shoulders photo of the presenter, as well as the following text:

Who's on the phone?
About me: Wendy Littman

  • Usability expertise
  • Communication planning
  • Web writing
  • Project management

OK, I see we have quite a number of folks on the phone—I see some of our Web coordinators from EERE, the national labs, several from EES, New West, Energetics, and Anthro-Tech.

For those of you who don't already know me, my name is Wendy Littman, and I am the usability coordinator for EERE. I've been working on EERE Web sites in one capacity or another for about 10 years—first at NREL, and now at EES.

My background is primarily in technical communications. I've done several usability studies during my time at EERE, and about a year ago I took on this new role of usability coordinator at Sarah Kirchen's request.

This past year I worked with a team of ITP contractors to do some research on the ITP site audiences, and we also conducted a more formal benchmark usability study of the ITP site, and I know several of you are already familiar with that study.

The screen changes to a slide titled "What is a usable Web site?" There is a head-and-shoulders photo of a woman with a word balloon containing information the narrator reads.

So today we're going to talk a little about how we can use Crazy Egg to make our sites more usable—but first what do we mean by a usable Web site? In all of the definitions of usability, the focus is on the user and not on the product. So a product's usability, then, is determined by the user's success in using that product.

Ginny Redish, who is well-known in the Web writing and plain language field—I know several of you are familiar with her work—says that usability means making sure that the people who use your Web site can use it to answer their questions, to reach their goals and to do their work productively and easily.

Usability means providing value that users can see themselves.

There are lots of ways that we can measure a Web site's usability, ranging from more formal benchmark usability studies like we did on ITP, to simple diagnostic testing using a tool like Crazy Egg.

The type of testing that you decide to do, of course, is going to depend on the research questions that you're trying to address, and so often you're going to end up using a combination of several usability testing techniques to help you answer your questions.

The screen transitions to a new slide titled "What is Crazy Egg?" and containing information that the narrator then covers in the text that follows.

Crazy Egg is one of four new online software packages that TAO has purchased for the programs to use to gather feedback on our sites remotely. Crazy Egg is a great place to start addressing some of the questions that you might have about how your users are interacting with your Web pages.

The way it works is that it collects information on where users are clicking when they come to your page during a certain time period, and then it uses that data to create heat maps and other reports of usage patterns.
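
The heat-map step described above is, conceptually, just an aggregation: each recorded click is an (x, y) coordinate on the page, and nearby clicks are binned together so that denser bins render as hotter colors. Here is a minimal sketch in Python; this is illustrative only, not Crazy Egg's actual implementation, and the sample coordinates are invented:

```python
from collections import Counter

def heat_map(clicks, cell=50):
    """Bin raw (x, y) click coordinates into a grid of cell-by-cell
    pixel squares; the count per cell drives the color intensity."""
    grid = Counter()
    for x, y in clicks:
        grid[(x // cell, y // cell)] += 1
    return grid

# Three clicks clustered near a hypothetical search box around (120, 40),
# plus one stray click lower on the page:
clicks = [(118, 35), (125, 42), (130, 38), (600, 400)]
hot = heat_map(clicks)
# hot[(2, 0)] holds the three clustered clicks; hot[(12, 8)] holds the stray one.
```

A real tool renders each bin as a colored blob over a screenshot of the page, but the underlying data is this kind of per-region count.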

And because it can help you answer questions about your pages, it's a great tool to use before a redesign to see what people are commonly using on an existing page. You can also use it after you redesign a page just to see how users are interacting with your new design. You could use it, for example, when you deploy a new element on your page, and you want to make sure that users are seeing it.

Crazy Egg is not a magic bullet, but it is a quick and easy way to answer some of your questions about how users are using your site, and often looking at the data is going to generate additional questions that you can then explore with other usability methods like a usability study.

Alright, so, anybody have any questions so far? If you do, please go ahead and raise your little hand there and I will unmute you.

OK. Not seeing any questions so far, so we will go ahead and move on ...

The screen transitions to a slide that reads, "Take a look at the EERE home page ... What questions, concerns or hunches do you have about how users interact with it?"

Alright. What I'd like to do now is a little brainstorming activity. We're going to take a look at the EERE home page together, which as you know was redesigned in February—and I want you to tell me what questions, concerns or hunches you might have about how users interact with this page. I'm going to go over to the home page now ...

The narrator opens a separate browser window displaying the EERE home page. The narrator moves the mouse to the promo box in the center content, a series of rotating images and descriptions that transition through the top EERE initiatives and stories periodically. The box contains the top four stories and allows users to click a text link for the full story, skip through stories using arrow links, or simply click on the image to view the story.

If you have something to say, if you could just go ahead and raise your hand and I will go ahead and unmute you ...

For example, as a prompt, I was curious when I saw the new page to know how users were going to interact with this new promo box here. I wanted to know if they were going to be clicking on these buttons down here, where else they might be clicking, if they just ignored it, or if they used it. Does anybody have any other questions that they were interested in, when they saw this page?

Participant 1: I was just interested to see if there was a difference in clicking between the Energy Efficiency icons and the Renewable Energy icons. Maybe the Energy Efficiency ones got more people, because they were next to the feature, and they're more in the middle of the page, or if there was the difference there.

Narrator: OK, great. Thank you. Anybody else?

Participant 2: I was wondering if the drop-down menus get used often or not. Just curious about the drop-down.

Narrator: Alright, terrific. Anybody else? ... OK. Well, those are all great comments, and in fact, I went ahead and I anticipated a few of your research questions. I know some of you may be a little shy. So I'm going to go over to the presentation here.

The narrator opens the presentation slides again, showing a new slide with the EERE home page.

So, because some of the questions that you're asking now are very similar to the questions the study team had, we're going to step through a few of their research questions, and then we'll take a look at the Crazy Egg data for this page to see what we can learn to help us answer some of those questions.

For example, we had some questions on search: Are users utilizing the search? Are they using the search help? How about the A-Z subject index?

In terms of navigation, how are users navigating back to EERE? Are they clicking on the mobile site link, this little link up here at the top of the home page? How are users interacting with the global navigation? Are they clicking on the items in the drop-down menus, and which menus might be the most popular, for example? How are they interacting with these topics?

The narrator points to the lists of links under the headings "Energy Efficiency" and "Renewable Energy."

Which ones are the most popular; which ones are the least popular? Are they clicking on the icons, or are they just clicking on the links? That kind of thing.

The footer—the footer's actually below the fold for most people, so are people scrolling down to see the footer? Are they using things in the footer, and what might be the most popular?

The promo box, like I had mentioned earlier—how are people interacting with the promo box? There's an email sign-up here—are people using that? Things like that. What are users seeing?

The screen transitions to show the EERE home page darkened with a heat map overlay displaying hot areas or high activity links in orange and cold areas or low activity links in blue.

OK, so now we're going to take a look at the Crazy Egg data. This is actually from a Crazy Egg analysis that Anthro-Tech did for NREL right after the home page was redesigned and deployed in the spring.

As you know, the new design was quite a departure from the old way of accessing content. The new design displays topics instead of programs, and it has drop-downs, you know, for example.

So the purpose of this study was to understand more about how the new design is being used, and then to No. 1, translate those findings into a series of concrete design changes that could be implemented right away, and then No. 2, to use the data to develop additional research questions for the subsequent usability study that NREL just recently completed.

Here you can see at a glance where people are clicking—the brighter colors mean more activity. Blue is where there are fewer clicks; black is where there are no clicks—and of course, knowing where there are no clicks is just as relevant as knowing where people are clicking, because users may not be interacting with something that you want them to notice. In that case, for example, you might want to think about moving the item or highlighting it in a different way.

So this is just a quick visual to show you the areas of interest on the EERE home page back in February.

The screen returns to the presentation, with a slide titled "Click Analytics: What We Learned" and depicting a photo of a hand using a computer mouse.

Now what we're going to do is we're going to dive in and take a closer look at what we can learn from the click analysis. I'll just preface this by saying that this analysis was done right after the page was launched, so the usage patterns are a little bit different now. Also, NREL has made some changes based on this analysis, so the home page that you're seeing here is different now than it was during this test.

The screen transitions to display the EERE home page with statistics on user traffic.

So this is one of the reports that we get from Crazy Egg. And you can see that it shows us more detail about where people are clicking. From this report, for example, we learned that search is the most popular item on the home page. It actually received 7.6% of the clicks.

We also learned that next to search, the most popular content is the topically based menus with the icons. You can see, for example, that they are definitely drawing people's attention very effectively. The most heavily used links tend to be at the top of these lists. To address your earlier question, Solar is the most popular.

Say, for example, that we wanted to promote Water—Water's right here—and one of the things we could do is move it up to the top of the Renewable Energy list, as an example.

We also learned that a few items weren't being used as much at that time—the glossary and the multimedia. We know that users are easily overwhelmed by a lot of content (we recently saw that in the ITP study)—so Crazy Egg is a tool that you can use to help you make decisions about content you might want to keep on your home page, or another page, and content you might want to eliminate or move to another page.

The screen transitions to a darkened EERE home page with a variety of colored dots, referred to as confetti, which indicate where users are clicking.

This is called the confetti view, and we're going to talk more about it in a little while—but each dot here represents a click. So you can see that people are clicking on both sides of the split logo up here, and you can see for example that the DOE side gets more clicks than the EERE side.

From this study, we also learned that people are browsing the drop-down menus, but most commonly they're actually accessing the content using the visible links.

You can see here that the Solar link, for example, on the visible menu got 891 clicks, but the Solar menu on the drop-down only got 150 clicks. We also found that the visible links in the gray footer box at the bottom of the page are used more frequently than the drop-downs, and when we retested just recently we found that those drop-downs were being used about the same or potentially maybe even a little bit less frequently than before.

If, for example, there was a desire to promote the drop-downs, we could give them a different graphical treatment and retest. Or we might decide, based on usage data, to remove them. Those are the types of decisions you could make based on this data.

We also learned that the audience-based links and resources are used fairly frequently. You can see a large number of clicks there. At that point they were being used more frequently than the blogs or the Connect and Share areas.

And we also found that people are clicking, as you can see, on some static design elements: this eere.energy.gov text up here, these icons near the links, and these headers down here, where you can see there are some clicks.

So, here, for example, we might decide to link the icons, which I know is something that NREL has done based on this data. And we also might want to think about making a slight design change to the headers down here on this page to make it clear whether they are clickable or not, which is also something that NREL has done.

We also see that people are interacting with the feature rotator. Many are using these arrow controls down here. Some are following the text links in the stories themselves. And some are clicking directly on the image itself or on the text as well.

So, you can see from this data that on the EERE home page it's important to be sure that the image itself and all the text is a link, for example, which it is.

The screen transitions to a darkened view of the bottom of the EERE home page, still with the confetti showing volume of clicks on various areas of the page.

We found that the footer is being used within two seconds of coming to the page, even though it's below the fold for most people. So this really indicates that people are using that gray footer and they're going straight down to it, so that's a feature that we should probably keep.

The screen changes back to the view of the darkened top part of the EERE home page, confetti view. A list of search engines and other sites is displayed on the left, with a different color range assigned to each to correspond to the confetti.

Finally, we learned that visitors to the home page are coming from a variety of different search engines and energy-related sites.

You can see from these results that about a third of our visitors are actually unreferred—meaning that they're typing the address directly into their browsers, maybe they have a bookmark, they could be clicking on a link from an email, things like that.

You can see from the last two slides that Crazy Egg offers this ability to filter the click data—here it's filtered by referrer, but Crazy Egg also offers a number of other ways to filter the data, which we'll see when we take the tour of the tool.

So you can see that NREL has made a number of changes based on this Crazy Egg analysis.

Alright. Another question time. Does anybody have any questions? If you do, please go ahead and raise your hand.

OK, I don't see any questions, so I'm going to go ahead and move on.

The screen changes back to the presentation, with a slide titled "Crazy Egg is One of Many ..." and depicting logos of various Web programs, such as Google Analytics, WebTrends, and Coremetrics.

Crazy Egg is one of many different Web site analysis tools, but it's special because it provides a level of information that many of the other packages don't.

The slide changes to one titled "What's So Special About Crazy Egg?" and containing the following points, which the presenter then reviews:

These are what I would consider to be the strengths of Crazy Egg as an analysis tool.

  • It allows you to visualize the click data, so you can see exactly where people are clicking on each page.
  • It's been called a poor man's eye tracker—eye tracking, for those of you who aren't familiar with it, measures where a user is looking at a page; Crazy Egg actually goes one step beyond that and measures an action the user takes on a page, and it's actually very inexpensive compared to eye tracking, which can cost thousands of dollars for a single study.
  • It is quick and easy to set up and run a Crazy Egg test, and the data that you get back is concrete and actionable.
  • And because it's so easy to run a test, Crazy Egg can be used as a comparative tool—you may want to run a test, make some changes, and then run another test to see if there has been an improvement.
  • Crazy Egg also collects real-time data.
  • The price is right—this is a very inexpensive tool, and of course the best part of it for you all is that it is free to the programs because TAO picks up the cost.

The slide changes to one with the following information reviewed by the presenter:

And of course, as with any tool, there are some constraints on its capabilities.

  • Crazy Egg provides per page analysis only. Generally we run it on three to five pages per site, so you have to choose your pages carefully, and realize that you aren't necessarily seeing the full picture. Google Analytics actually has a new tool in beta right now that is very similar to Crazy Egg, but it actually allows you to mouse through and look at all of the data on every page. It's not quite as elegant as Crazy Egg. However, because Google Analytics uses persistent cookies, we aren't yet allowed to use Google Analytics on our sites.
  • If you have dynamic pages, you need to understand how Crazy Egg works in order to put the data in context. So if you have a feature rotator, for example, what you're going to see is where on the page people are clicking, but you're not going to be able to see exactly which of the features they're interacting with.
  • Crazy Egg offers the ability to collect information on new versus return users—but again, this is something we currently can't take advantage of due to the persistent cookie policy. So depending on how GSA interprets the cookie policy that just came out, this is actually something we may potentially be able to take advantage of in the future, which would be really exciting.
  • Crazy Egg will help you see where your users click, but it won't help you understand why they're clicking (or not clicking), so you really need to think about using Crazy Egg in conjunction with other usability tools that will help you get at the "why," such as a usability study.
  • And finally, there's a little bit of a learning curve in interpreting your data, so the first time or two that you use it, it's helpful to go through the data with someone who has some experience using the tool. So when you have your first set of results and you're ready to talk about them, just let me know and we'll set up an hour to look over your results together.

Alright, does anybody have any questions before we take a little tour of Crazy Egg? If you do, go ahead and raise your hand and I'll unmute you.

Participant 2: Do you happen to know when the cookie policy might be revised?

Narrator: I know that the cookie policy itself has been revised, but my understanding is that it was not as straightforward as we had hoped it might be. So because of that, actually GSA is in the process of interpreting what exactly that's going to mean for all of us. So that is what's happening. I don't have an actual estimate for that time period.

Participant 3: I was wondering, the analysis that you showed us—how long did the test run? In terms of, how long did you analyze data on that home page?

Narrator: I think that test ran for about a week. That's typically what we do, is we run it for about a week. The home page gets a lot of traffic. So you'll see when we go through the Crazy Egg tool, but you can set it either to collect a certain number of clicks or to stop after a certain number of days. We collected about 25,000 clicks for the home page.

Participant 4: Thanks for the question about cookies. We're still waiting for the federal Web metrics group to work with the folks at GSA (the WebContent.gov professionals). They're going to look at the guidance that the federal Web metrics group has put together and then essentially offer all agencies some boilerplate guidance, which the Department, and then EERE, will be able to adopt. We have been waiting for quite some time on that. We've asked if they can give us an updated timeline; we haven't yet heard.

Narrator: Great. Thank you. ... Does anybody else have any questions?

Now let's go ahead and take a look at Crazy Egg itself. What we're going to do is take a look at the old ITP home page, before it was redesigned. We ran a Crazy Egg test on it.

The screen transitions to a Web page displaying the Crazy Egg application.

Participant 5: I don't have a question so much as a comment. When you were talking about the EERE home page earlier, and you were showing how the Solar link is so popular, and you hypothesized that moving the Water link up to the top of that column would actually then generate a similar number of clicks. The way that we designed this is by looking at what some of the most popular topics are. We know that solar is a really popular topic. Water, not so much. So I just wanted to comment on the analysis part of looking at these clicks. And that's my take on that.

Narrator: OK, well, thank you. I appreciate that. And you're right, it is a bit of a chicken-or-egg kind of thing. If you designed it specifically that way, then that is what you would expect. You'll actually see in this snapshot that the phenomenon of having a larger number of clicks at the top of a list is pretty common. We saw the same thing on the ITP site, and I would expect that we'll be seeing it on additional sites, too. So I think you're right; in this case, it's probably a combination of the two factors. So thank you for that comment.

Participant 5: Yeah, I think it's interesting to think about the connection and the causality and sort of analyze it that way.

Narrator: Oh, definitely. And that's why it's really important that the people who are developing these sites are the ones involved in analyzing the Crazy Egg data for those sites. You all know best what's happening on your site and how you designed things. Great; thank you very much.

OK, so, to move along with our tour: This is called the dashboard. You can see here how many visits we recorded and how many of those visitors actually clicked on something when they came. Here, for example, about 75% of the people who visited the ITP site at that time clicked on something on the page. As I mentioned earlier, we can set the profiles to run for a certain number of clicks or a certain number of days, depending on what you're trying to test, and we will often collect data for about seven days so that we can get a whole week's cycle (here we actually ran it a little longer). The specifics of what we set are going to depend on the test.

I'm going to go ahead and move over to view these results ...

The screen changes to a darkened view of the Industrial Technologies Program home page.

So this is the heat map view for ITP. Just a reminder that the bright colors show frequently clicked content. This view is useful to get a general overview of the activity on your page. So for example, on this page you can see pretty easily that anything with the word "industry" or "industrial" is very popular. You can see the search is also popular.

I'm going to scroll down the page. You can see that these features here are not being used very much. The feature rotator on this site wasn't being used much at all, and the content below the fold wasn't being used much either. There were some clicks in the right navigation, so that area was being used. I'm going to go back over to the overlay view ...

The view changes to the ITP home page with statistical icons at various linking points.

So when you're ready to dig into the data, this is the view that you'd probably want to look at. This report provides more information on the number of clicks that each item is getting. Again, the warmer colors represent more clicks, and the blues represent fewer clicks.

For example, when I mouse over some of these, you can see that the top navigation is being used, but that the popularity of the buttons decreases from left to right. So you might want to consider, for example, putting more important content bins toward the left.

Again, the items on the top of the list here are the most popular ones (the mouse points to the links in the page's center content, under "Save Energy Now" and "Technology Delivery"). The popularity decreases as you go down the list in all cases here.

One interesting point: On this site, as I mentioned, the feature rotator was not being clicked on; in fact, we only had four clicks on it. And we also saw that in user testing on this particular page, users just ignored the features. They were concerned about missing core information in the rotator. However, we did see people interacting with the rotator quite a lot on the EERE home page—so to me, the take-away here is that users are going to interact with different pages in different ways, so it's really important to look at how your users are using your site in particular. And also to test your site over time, to see how usage is changing over time.

OK, now what I'm going to do is I'm going to click on a red dot here.

The mouse pointer clicks on a red dot icon and opens a small window next to it, with a red background, depicting a pull-down menu and list.

So let's say you want more information, for example, about where the people who are clicking on this particular link are coming from.

The options in the red window indicate the number/percentage of clicks; the date can be filtered, as well. The narrator selects "referrer."

You can see for example here that this particular link received almost 9% of the clicks. And what I've done here is I've filtered that data by referrer, so I can see for example that the majority of the people that are clicking on that link came from the home page or were direct traffic or came from Energy.gov. There are very, very few that are coming from external search engines that are actually clicking on that link. And you can do that with any of the links. For example, you can open these, and that kind of thing.

You can also take a look at the click data in the form of a table. And that's what this is here.

A long list in table form appears in the middle of the screen.

You can actually look at some hidden links as well as the obvious links on the page with that table.

And now I'm going to go ahead and click on the confetti view, which actually takes a little bit of time to load sometimes. Now the confetti view, like we saw earlier, provides the ability to filter all of the data. So as you saw, each dot represents a click.

The screen transitions to the darkened ITP home page, confetti view.

There are two types of information that you can get out of the confetti view. No. 1, it allows you to filter all of the data at once for a particular filter, and of course, again, you can also see exactly where people are clicking. So you can get to a finer level of detail than with the previous report.

So here you can see, for example, that people are clicking all along this blue banner (the mouse pointer moves over the banner at the top of the screen). We know that people are clicking on the banner, but we can see that they're clicking all along the blue banner. So it would be good for example to be sure that the whole blue banner is linked.

Say I wanted to see if there was a difference between where people coming directly to the page were clicking versus where people coming from Google were clicking. I can go ahead and click on that here, and it will show where people coming from Google clicked versus where people who came directly to this page clicked.

There are a number of other filters as well. These are all the different filters. Some of them we can't use; we don't collect persistent cookies, so we can't use the return visitor filter. These other filters allow you, for example, to look at time to click, which tells you how long people are looking at a page before they click on something, and where they clicked. For example: where do people who looked at the page for a really long time click, versus people who come and click within a second?
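
The filters described here amount to slicing the same set of click records along different fields, such as referrer or time to click. A hypothetical sketch in Python; the record fields mirror the filters mentioned above, but the data and function names are invented for illustration:

```python
# Hypothetical click records; a referrer of None means direct traffic.
clicks = [
    {"target": "Search",   "referrer": "google.com", "seconds_to_click": 1.4},
    {"target": "Solar",    "referrer": None,         "seconds_to_click": 0.9},
    {"target": "Glossary", "referrer": "energy.gov", "seconds_to_click": 27.0},
    {"target": "Search",   "referrer": "google.com", "seconds_to_click": 15.2},
]

def clicks_from(records, referrer):
    """All click targets for visitors arriving from a given referrer."""
    return [r["target"] for r in records if r["referrer"] == referrer]

def slow_clicks(records, min_seconds):
    """Clicks made only after a long look at the page, a possible sign
    that users are scanning a crowded layout before committing."""
    return [r["target"] for r in records if r["seconds_to_click"] >= min_seconds]

clicks_from(clicks, "google.com")  # which links Google visitors chose
slow_clicks(clicks, 10.0)          # which links took a long look first
```

Filtering by referrer and by time to click are just two cuts of the same data, which is why the confetti view can switch between them without re-running the test.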

In the case of ITP, for example, we saw that people were spending a long time looking at this page before clicking, and that made us wonder why. So during the usability testing, we saw the same behavior, and were able to learn that people were looking for a long time because there was just so much information on the page and people were kinda overwhelmed.

Alright. Any questions before we move on? If anybody has any, please go ahead and raise your hand.

OK, I'm not seeing any questions, so I'm going to go ahead and go back to our PowerPoint slides. I'm going to skip through a couple of these, since we went ahead and looked at the live site ...

The screen shows a presentation slide titled "Getting Started" and including the information the presenter reviews below:

OK. So here's a list of the steps that you're going to go through when you're ready to get some Crazy Egg data on your sites. Go ahead and contact me at the beginning of the process, and I can give you some more details. But in a nutshell:

  1. The first thing you're going to want to do is identify your research questions.
  2. Once you've done that, you're going to determine the pages that you want to test. Of course, the research questions are really what's going to help you determine which pages to test—but as a rule of thumb, I suggest you test the home page; any pages that have a different or unique template (for example, a second level with a right column), in case people are using those pages differently; and any pages that you plan to make a change on, so you can compare the before and the after. You could also choose a popular page, a top entry page, or a page with content that the program particularly wants to push. Or something that's in the media, if you want to see whether the media's driving traffic to a particular page and where visitors are clicking. You'll probably want to start out with three to four pages.
  3. What you would do then is go ahead and add the Crazy Egg code to your pages and republish your pages. I can supply you with the code.
  4. Once you've done that, you just go ahead and let me know, and I will set up the test for you and give you access to your results, so that you can see them.
  5. The first time or two that you run a study, as I said, give me a call when you're ready for analysis, and we'll set up an hour to look at the data together. That will help make sure you're getting the most out of your study. You may also want to write up a simple report on your findings; if you do, please send me a copy. So everybody knows, I am collecting EERE usability project plans, materials, reports, and the like, so that other programs can use them, and also to make sure that we're applying our usability methods consistently throughout EERE.
  6. You're going to come out of the study with a list of changes that you may want to make to your pages. So once you make the changes, you can retest, if you want, just to be sure that the changes have the impact that you were hoping for.

The screen transitions to a slide that reads:
Wendy Littman
wendy.littman@hq.doe.gov
301-525-7521

When you are ready to start using this tool, please contact me—here is my contact information for anybody that doesn't have it already.

Does anybody have any final questions or comments? If you do, go ahead and raise your hand.

Participant 2: To add the actual Crazy Egg code to the page, a lot of us are restricted by RedDot. Do we have to contact EES to actually implement that code into the page, or can we enter it through the RedDot interface?

Narrator: No, the developers can enter it through the RedDot interface. You would just go to the bottom of the center content, click over to the code view, copy and paste this little piece of JavaScript into the very bottom of the page, and republish.

Participant 2: So it doesn't strip out the JavaScript?

Narrator: Nope.
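
For reference, the snippet being discussed is a single external script reference pasted at the bottom of the page. Its general shape looks like the following; the numeric account path shown here is a made-up placeholder, and the real snippet comes from your own Crazy Egg account:

```html
<!-- Crazy Egg tracking snippet (illustrative): the 0000/0000 path is a
     placeholder; use the exact snippet issued for your account. -->
<script type="text/javascript"
        src="//script.crazyegg.com/pages/scripts/0000/0000.js"
        async="async"></script>
```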

Anybody else have a question, or comment?

Alright, then. I want to thank you for coming, and I hope you enjoyed the presentation. Stay tuned, because we'll be doing more of these presentations in the future. Thank you very much.