Time Parting and my own lack of time

I thought once I graduated that I would have this ridiculous amount of free time to blog away, write a symphony or two, solve world hunger, etc. Sadly, none of this has come to fruition, hence the lack of blogging. Not sure where all the hours in the day go anymore, though work has picked up and I have tried to spend more time with family and less time writing. That, and it's possible I have somewhat of a writer's block at the moment. Lately, I've been fighting fires instead of solving problems, and that might be why I've been so quiet. It's when I'm challenged with a new problem that I get energized and end up writing about it. So hopefully some of the fires will smolder out soon and I can get back to trying new things.

Anywho…one thing that I've spent some time looking over, and am about to implement with some help from the Omniture Engineering crew, is Time Parting in my SiteCatalyst implementation. What is Time Parting? Basically, it allows you to capture more granular elements of time for use in reporting. Have you ever wanted to see how many widgets you sell at different times of the day? What times do instances of eVars and sProps happen during the day? Time Parting lets you do that: you capture a time stamp and then build classifications on top of it. So you'd capture a time and date like "July 24,2009:8:00pm" and classify it with SAINT to show that the day is a Friday and the hour is 8 PM, or that the hour range is 8 to 9 PM, or any way you want to classify it. You'd then be able to run reports like "show me the last 52 Mondays" or "what does a normal Monday look like with regard to traffic and other metrics at noon?"
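
To make the classification piece concrete, here's a quick sketch (in Python, nothing Omniture provides) of how you could generate the SAINT rows from the raw time-stamp keys. The key format and the column names are just my assumptions based on the example above; swap in whatever you actually capture and configure.

```python
from datetime import datetime, timedelta

def classify_timestamp(raw):
    # Assumes the raw eVar key looks exactly like "July 24,2009:8:00pm";
    # adjust the format string to whatever you actually capture.
    dt = datetime.strptime(raw, "%B %d,%Y:%I:%M%p")
    day = dt.strftime("%A")                                  # e.g. "Friday"
    hour = dt.strftime("%I %p").lstrip("0")                  # e.g. "8 PM"
    next_hour = (dt + timedelta(hours=1)).strftime("%I %p").lstrip("0")
    hour_range = "%s to %s" % (hour, next_hour)              # e.g. "8 PM to 9 PM"
    return "\t".join([raw, day, hour, hour_range])

# Hypothetical column names -- match them to the classifications you create.
keys = ["July 24,2009:8:00pm"]  # in practice, read these from the SAINT export
with open("time_parting_classifications.tab", "w") as out:
    out.write("Key\tDay of Week\tHour\tHour Range\n")
    for key in keys:
        out.write(classify_timestamp(key) + "\n")
```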

Those are all nice things to have, but here is how I plan to use it: segmentation. I wrote about this a while back, but Discover has one big weakness when it comes to creating segments, and it's the lack of a 'time' dimension. You can create a segment of visitors that did this behavior, then did this other behavior, went to these pages, etc., but you have no way to spell out when any of those things happened. So say I have a page that I've recently made changes to and I want to look at visitor behavior after that point in time; I can't do it. But with Time Parting I can, because I could create a segment that says give me all visitors that went to this page on July 24th and did this thing, and then study that behavior over the next week. Or look at differences in campaigns on different days, or different promotions. So if someone comes in on Paid Search term X, how long until they come back again on a different term or tactic? It really opens up some interesting possibilities.

Quite frankly, I think this kind of stuff should already exist in the tool out of the box, but it's a VISTA rule if you are interested. Another wishlist item is for Omniture to do a better job of publicizing all of these VISTA rules so that the wider community knows they exist; a lot of times I stumble across them via conversations I have with folks but never see them in the Knowledge Base or elsewhere. Maybe a VISTA FIESTA on the Omniture blog is needed.

OK, cool…writer’s block averted. Stay tuned.

A solution to backing out bad Omniture data…sort of

You're in luck this week, avid Diary readers. I am sitting about 30 feet from the Atlantic Ocean in the Outer Banks of North Carolina, and it's currently raining, so I am inside writing. As an interesting aside, I found out that last week Jimmy Buffett was renting the cottage I'm in. Would've been cool if I had been here at the same time; I am sure he is a hell of a storyteller. Probably doesn't know too much about Web Analytics, however.

I recently asked the masses to help me solve a problem with backing out data in Omniture. Unfortunately, the masses were silent, which leads me to believe one of the following: 1) no one cares, or 2) no one knows the answer. I am hoping #2 is the answer. So despite the silence, I've never stopped thinking about it, and I think I came up with a solution. It's not elegant, but it works…sort of.

Since I am in the giving mode, I thought I’d share.

Step 1: You need to create an eVar to duplicate the value you capture via Purchase IDs. I called it Order Number.

Step 2: Create a SAINT classification for the newly created Order Number eVar. Make one of the columns 'Status' or whatever you want to call it.

Step 3: Export the SAINT classification for Order Number and classify any order that was cancelled with something like 'Cancel'.

Step 4: Import the SAINT classification.

Now you have the ability to filter on orders that are 'Cancels' or not. I've done it with some of our test data and have moved on to putting together a process to do this going forward. You can look at your campaigns to see which ones are generating cancels. You can look at individual days and see how much was cancelled. I think the best use of it is actually with the Excel plugin, as you could create data blocks with cancels (or without) and then use cell references to build cleaned-up, filtered dashboards. Pretty neat stuff.
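
If you'd rather script steps 3 and 4 than edit the export by hand, here's a rough sketch of the idea. It assumes the SAINT export is tab-delimited with a Key column (the order number) and the Status column from step 2; the file names and the cancelled-order list are placeholders, and any comment/header rows the real export includes would need to be stripped or preserved separately.

```python
import csv

# Hypothetical inputs: the SAINT export for the Order Number eVar and a list
# of order IDs your commerce system says were cancelled.
cancelled = set(line.strip() for line in open("cancelled_order_ids.txt"))

with open("order_number_export.tab") as src, \
     open("order_number_import.tab", "w") as dst:
    reader = csv.DictReader(src, delimiter="\t")
    writer = csv.DictWriter(dst, fieldnames=reader.fieldnames, delimiter="\t")
    writer.writeheader()
    for row in reader:
        # "Key" and "Status" must match the column headers in your export.
        if row["Key"] in cancelled:
            row["Status"] = "Cancel"   # flag cancelled orders for filtering
        writer.writerow(row)
```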

Pros of doing this:

It takes 5 minutes to put into action. You don't need Omniture Engineering. It's a living table, so as orders get cancelled or statuses change, you can update the SAINT file without causing a ripple in the space-time continuum.

Cons of doing this:

It still doesn't allow you to change the value of bad data (such as an incorrect dollar amount); it simply gives you a way to filter it out.

At the end of the day, I still think Omniture needs a way to treat Purchase IDs as a classification, so you can easily move things around in a GUI like Campaign Manager, as well as alter order data without having to go to Engineering. The idea is to make the tools as useful as possible and make it so your admin can actually administer the data.

Graduation

This weekend marks the end of a journey that started in early 2006 with a challenge to myself to go back to grad school and get my master's in business. On Saturday I graduate.

I can't lie, it feels damn good. In some respects this process was a way to get me out of my normal element and learn more about myself. It might sound odd or masochistic, but by putting yourself through a series of challenges, you end up figuring out who you are. You see your strengths and weaknesses, and you realize how to work with others to accomplish great things. People ask me all the time, "Was it worth it?" I definitely would say yes, for the simple reason that I think I understand myself better than I did when I started. Sure, I got a more refined business education and so forth, but the stress of meeting deadlines, working with other people and their differences of opinion, and somehow keeping the rest of my life in order helped me see things differently. In fact, I think things have slowed down for me and I can see the big picture more than I could before. I look at things through a different lens based on this experience. All good things.

So now what? To be brutally honest, I don't know what I want to be when I grow up. I never have. I tell people all the time that I've had no plan to get where I'm at, so why start now? In any case, based on this experience I think I've put myself in a good position to take on whatever life brings, and I look forward to it. Now back to only having one job every day.

Backing out bad data in Omniture

Hello blog…I’ve missed you.  How are you these days? I see you’ve accumulated quite a lot of dust since last we spoke.

Oh right. 

Anywho…so my post today is actually a plea to blogistan, and to some extent Omniture. Since I know people occasionally read this, here is my dilemma. I have some bad transactions in my data that I need removed. For reasons beyond my comprehension, our website had $800,000 ThinkPads for sale, and somehow someone bought one or two. Truth be told, they were just pricing errors, but somehow those transactions went through and are now sitting in my data. Honestly, if we could just sell 3 or 4 of these $800k computers, we'd be having a fabulous quarter, so maybe we should figure out how to do more of them. But in reality this is just bad data. Unfortunately, bad data is not at all easy to remove from SiteCatalyst.

If you ask Omniture, they'll tell you to create a couple of new metrics and import them via Data Sources, so that you'd have metrics like 'Cancelled Revenue', 'Cancelled Units', and 'Cancelled Orders', and then create a calculated metric to net them out. Unfortunately, that would mean redoing about a billion reports and a ton of user education to stop using the metric called 'revenue'. Shouldn't we be able to just remove or alter the offending records if needed? I think so. Maybe I am crazy.

So, here is where I need your help. I know someone, somewhere has created a Data Source that backs out bad records using negative numbers. I'd love to hear how you've done it. I created a Data Source that had the real metrics like revenue, orders, and units, and then loaded in the bad order ID and a negative number, but nothing happened. I know this method is frowned upon because it has the potential to corrupt the database and open up a wormhole in the fabric of the universe (saw Star Trek yesterday), but I don't care, I want the bad data out. It looks ugly. I'm a big boy, I'll take the risk. I just want to figure out how to do it.
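
For anyone who wants to try the same experiment (or tell me what I did wrong), here's roughly the shape of the file I tried to upload. The column names are placeholders; they depend entirely on the dimensions and metrics you define when you set up the Data Source in SiteCatalyst.

```python
# Sketch of a negative-value Data Sources upload to net out one bad order.
# Column headers are placeholders -- they must match your Data Source setup.
rows = [
    # Date, Order Number, Revenue, Orders, Units (negative to back out the bad order)
    ("07/24/2009", "1234567890", -800000.00, -1, -1),
]

with open("backout_upload.tab", "w") as out:
    out.write("Date\tOrder Number\tRevenue\tOrders\tUnits\n")
    for row in rows:
        out.write("\t".join(str(value) for value in row) + "\n")
```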

The other route I'd love to see taken is for Omniture to create a way to do this via a GUI in SiteCatalyst. Almost treat Purchase IDs and the data associated with them as a classification, to some extent, that can be altered. I know the reason this doesn't exist (or at least I think I do) is that all the data is pre-processed into OLAP cubes for speed of delivery, and redoing that stuff would cause reprocessing and slow the whole cycle down. But there has got to be a way to do this, because right now the only true solution is getting Omniture Engineering to back that out for you, and that isn't remotely cheap, especially if it keeps happening all over the world.

To repeat myself…if you've cracked the code on how to do this, please post…and I'll owe you a beer (or 10) at next year's Summit. Let's see the Wisdom of Crowds in action!

The exile ends in 7 days

Last class in grad school is 7 days away, and the blogging exile will end. Back to the incoherence. Thanks for your patience and patronage.

Live from the Omniture Summit – Post #3 or 4 if you are keeping score at home

Wireless access at the conference was flaky, so I didn't get this out yesterday.

Some of the highlights from the breakout sessions –

Went to the Campaign Attribution session hosted by Mikel Chertudi. This session made me feel good, as I realized I am not the only one with troubles in attribution; it's a huge problem in the industry. There were some great tips in this session, such as creating one version of the truth, which I struggle with consistently. Your company likely has multiple agencies and multiple systems tracking revenue and conversions, and nothing ever matches up. The key here was to make either your CRM system or Omniture the point of truth. I couldn't agree more, as all the other systems look at things in a vacuum. If you are doing reporting from DART, for example, it only sees what is happening with banner ads. It has no idea that the conversions are actually occurring through Paid Search or an Affiliate. Using something like Omniture ties them all together.

I liked the best practice of setting the view-throughs for post-impression tracking to 7 days, as ours are set to 30 days now. The idea that people won't remember an ad longer than a few days is probably a valid one, and I think I might change our settings.

One of the things mentioned that we struggle with is whether to look at linear, last click, or first click as far as determining success. He accurately stated they are all valid, but every company needs to decide how it wants to evaluate success. One thing I am not certain on is linear. I didn't have time to ask after the session, but Chertudi mentioned that linear brings in all campaigns. So if you had 5 tactics that led to a sale, the revenue would be split among the 5. I don't think that is actually the case in SiteCatalyst, as I've found it only takes the last 2 and splits the revenue. Now if the tactics all occurred on the same visit, then yes, it will show all 5 and split; in fact, that happens with pages. But with multiple campaign tracking codes over multiple visits, I think it only takes the last 2. Since I know Omniture reads this from time to time, if there is any clarification on this one, I'd love to hear it. My understanding is that this is why the Campaign Stacking plugin was created, because of the limitations with linear attribution.

A great tip on deciding between last (most recent) and original (first) allocation is to measure them both and look at them side by side to see what the baseline is between them. If they are roughly the same, just choose one.

Another fantastic tip is to treat marketing tactics separately, meaning treat internal promotions, remarketing, and external tactics as separate variables when tracking. We do that with internal promotions and external promotions, but haven't done it with remarketing. I shall change. The idea is that otherwise the remarketing would overwrite the credit for whatever got the visitor to the site to begin with. Remarketing is similar to an internal promotion in some respects, just done outside of the site.

The last session I went to was 'What would you do with a couple of extra hours a day?', which was energizing and inspiring. The reason it was inspiring was that Randall (didn't catch the last name) from Electronic Arts presented on how he manages all the reporting and analytics across all of EA's sites with really just 4 people. I think everyone can relate to the problem of lacking resources; in fact, I met tons of other analysts in the same boat at lots of great companies, some bigger than ours. Seeing what he was doing with essentially nothing gave me hope that I can get us farther along than where we are without adding any new people (in fact, we've lost people).

I think the element that makes him successful is the rigorous upfront planning he used to make sure the coding was done the right way to help him measure the business. Without a solid foundation, all the automation in the world won't matter. I recently signed on to use Omniture consulting in an effort to help me clean up the implementation I did 3 years ago. As I readily admit, I am not a coder, and I did a lot of it quickly and have likely made some mistakes along the way. When I did this, best practices really didn't exist, so I am excited about the opportunity to clean this up and get my ship in order.

We struggle with creating a reporting structure that can reflect our global business, and the hierarchy EA is using gave me some great ideas on how to reorganize our data and measure a global business. He's set up quite a few sProps and eVars for content categorization and uses those to define his page naming schema. I've left our page names in Omniture as the same names that show up in the browser, by using the META title. Unfortunately, Omniture has character limits and readability limitations in reports with longer page titles. I think I can steal some of his naming taxonomy to make our page names shorter and more relevant. They'd also become more stable instead of changing whenever the web team decides to change a page or we do something to help SEO. Just seeing someone dealing with the same issues and organizing them in a better fashion gave me hope I can do the same thing. Throw in automation through the Excel plugin and other things I have, and I think I can realistically support all of my constituents without adding headcount. So I'm glad I saw that as the last session.
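
As a strawman for what a taxonomy-driven page name could look like (the levels and values here are purely hypothetical, not EA's schema or ours):

```python
def build_page_name(*levels):
    # Build a short, stable page name from taxonomy levels instead of the
    # META title, e.g. ("US", "Shop", "Laptops", "ThinkPad Detail") ->
    # "us:shop:laptops:thinkpad-detail".
    return ":".join(level.strip().lower().replace(" ", "-") for level in levels)

print(build_page_name("US", "Shop", "Laptops", "ThinkPad Detail"))
# -> us:shop:laptops:thinkpad-detail
```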

Twitter API into Omniture

Something really cool that I learned today is that there is a Twitter API that can be connected to Omniture's SiteCatalyst, giving you a way to feed all the Tweets about your brand, competitors, industry, etc. into an Omniture report suite. With some classification and filtering you could essentially build a database to mine all the Tweets and have it automated. The icing on the cake is you could set up a filter, and then an alert on certain keywords or phrases, and have it sent to people in your company who can take some sort of action. This is the beginning of creating an automated brand monitor using SiteCatalyst and the new open API architecture. Cool stuff. And I just joined Twitter as well, so I can start posting one-liners.
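
I haven't built any of this yet, but here's a sketch of the idea: pull the Tweets for a query and flatten them into a feed you could classify and filter. Everything below, from the search endpoint to the tab-delimited output, is an assumption about how the pieces might fit together, not the actual SiteCatalyst integration.

```python
import json
import urllib.parse
import urllib.request

# Sketch only: the URL reflects the public Twitter search API of the day;
# the real integration runs through Omniture's open API, not this script.
def fetch_tweets(query):
    url = "http://search.twitter.com/search.json?q=" + urllib.parse.quote(query)
    with urllib.request.urlopen(url) as resp:
        return json.load(resp).get("results", [])

def write_feed(tweets, path="brand_tweets.tab"):
    # Flatten tweets into a tab-delimited feed for classification/filtering.
    with open(path, "w") as out:
        out.write("Date\tAuthor\tTweet\n")
        for t in tweets:
            out.write("%s\t%s\t%s\n" % (t.get("created_at", ""),
                                        t.get("from_user", ""),
                                        t.get("text", "").replace("\t", " ")))

write_feed(fetch_tweets("thinkpad"))
```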