Apologies to the avid Diary readers: I wasn’t able to do a second post yesterday, as I ran out of battery and time after the Maroon 5 concert/after party. Anywho…here are some highlights from the day that I didn’t cover in the earlier post:
1 – Matt Belkin went over the new Recommendations tool and I was pretty impressed. I hope that we can somehow utilize it on our site. Recommendations is basically a slick algorithm that can be used to change offers on the site based on previous behavior. The demo was based on a scenario where someone had bought a dress on a clothing site on a previous visit, and the ads on the homepage on subsequent visits were dynamically served based on what other people who had also purchased that dress went on to buy, in order to do some upsell/cross-sell. Also interesting: the Recommendations algorithm can be configured with different rules such as Topseller, People who viewed this product bought this other product, People who bought this product also bought this other product, Product Affinities, etc. It also has controls for inventory, price, time frames, and placement. Cool stuff if you want to Amazonize your site, and who doesn’t want to do that?
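To make the “people who bought this product also bought this other product” rule concrete, here is a toy sketch of the idea. This is my own illustration of co-purchase counting, not Omniture’s actual algorithm or API; the function name and data shapes are made up.

```javascript
// Toy "people who bought this also bought" rule: rank products by how
// often they appear in the same order as the product being viewed.
// Each order is just an array of product IDs bought together.
function coPurchaseRecommendations(orders, productId, maxResults) {
  const counts = {};
  for (const order of orders) {
    if (!order.includes(productId)) continue;
    for (const other of order) {
      if (other === productId) continue;
      counts[other] = (counts[other] || 0) + 1;
    }
  }
  // Sort the co-purchased products by frequency, highest first.
  return Object.keys(counts)
    .sort((a, b) => counts[b] - counts[a])
    .slice(0, maxResults);
}

// The dress scenario from the demo: two shoppers also bought shoes,
// one also bought a bag, so shoes rank first.
const orders = [['dress', 'shoes'], ['dress', 'bag'], ['dress', 'shoes'], ['hat']];
coPurchaseRecommendations(orders, 'dress', 2);
```

The real product layers inventory, price, and placement controls on top of a ranking like this; the sketch only covers the ranking step.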
2 – Belkin also showed how the tools are starting to get more integrated with each other. The demo showed linkages between Test &amp; Target and SearchCenter to do keyword landing page tests. I wish we used Test &amp; Target, but we don’t. I wonder if the same thing can be done with the Optimost Genesis integration.
3 – I went to the High Tech breakout and there was an interesting tip that for some reason had never dawned on me: take your Internal Search keyword and pass an s.prop or eVar as null:search term for those search terms that yield no results. So if someone typed ‘Hamster’ on lenovo.com and got no results, the s.prop would be s.prop5=”null:Hamster”. I honestly have no idea why I chose the word ‘Hamster’, it’s just how I roll. By doing this you can obviously look at those results and see if you need to figure out suitable pages for these words without a home. Additionally, by capturing the term that way you can enable pathing to see what users end up doing when confronted with no results in search. Also, a good strategy that was discussed is to look at your Natural Search keywords, Paid Search keywords, and Internal Search terms against each other to see where you have opportunities.
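In code, the tip is a one-liner at search time. A minimal sketch, assuming s.prop5 is the variable you’ve set aside for internal search terms (the helper function is mine, not a standard plugin):

```javascript
// Record the internal search term in an s.prop, prefixing it with
// "null:" whenever the search returned zero results, so no-result
// terms can be filtered and pathed separately in reporting.
function trackInternalSearch(s, searchTerm, resultCount) {
  s.prop5 = resultCount === 0 ? 'null:' + searchTerm : searchTerm;
  return s.prop5;
}

// Someone searches for 'Hamster' and gets nothing back:
var s = {};
trackInternalSearch(s, 'Hamster', 0);  // s.prop5 === 'null:Hamster'
```

A simple filter on “null:” in the report then surfaces every homeless search term in one list.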
4 – Went to a breakout session on APIs hosted by Chris Wareham. Sidenote – Chris, sorry I didn’t stop by to say hi, I had to run to the next session. Recently, Omniture opened up its data platform so developers can take advantage of all the data captured in SiteCatalyst, Discover, and the other applications, set up automated feeds of that data into other applications, and go the other way with 3rd party data sources feeding back into Omniture. I want to spend some more time looking over how to take advantage of this functionality, and I’m in the process of familiarizing myself with XML and Web Services so I can actually code this stuff. I want to start linking together data sources that today don’t talk, and do it in an automated way to take myself out of the equation.
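As a first step in that XML/Web Services homework, here is a sketch of building an XML request body for a reporting call. To be clear, the element names and structure below are placeholders I made up for illustration, not Omniture’s actual API schema; check the developer documentation for the real request format.

```javascript
// Build a simple XML report-request body by hand. This only shows the
// string-assembly side of talking to a web service; the element names
// are illustrative placeholders, not a real API schema.
function buildReportRequest(reportSuiteId, dateFrom, dateTo, metrics) {
  const metricXml = metrics
    .map(m => '    <metric id="' + m + '"/>')
    .join('\n');
  return [
    '<reportRequest>',
    '  <reportSuiteID>' + reportSuiteId + '</reportSuiteID>',
    '  <dateFrom>' + dateFrom + '</dateFrom>',
    '  <dateTo>' + dateTo + '</dateTo>',
    '  <metrics>',
    metricXml,
    '  </metrics>',
    '</reportRequest>'
  ].join('\n');
}

// A month of revenue and orders for one report suite:
buildReportRequest('mysuite', '2009-06-01', '2009-06-30', ['revenue', 'orders']);
```

Once a request like this can be generated on a schedule, the response data can be pushed into whatever 3rd party system needs it, which is the “take myself out of the equation” part.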
5 – Went to the SEO analytics session. This was pretty helpful as we’ve been getting a lot more focused on driving more traffic via SEO (as is everyone else with declining marketing budgets) but don’t really have the right metrics to understand how well we are doing. Some of the KPIs discussed were Revenue per search (by keyword), Entry page conversion rate (based on keyword), Conversions, Bounce Rate, Avg Time on Site, and Page Views per Search. The advanced stuff is where I got really interested. I’ve talked in the past about the Unified Sources DB VISTA rule (catchy name), which when implemented treats SEO as if it were a campaign, so that it gets the same attribution and shows up alongside your other Campaigns. One drawback with this VISTA rule, though, is that the tracking code for SEO is assigned on the backend, unlike normal campaigns that use a parameter and a tracking code in the URL. That matters because the Campaign Stacking plugin wouldn’t add SEO to the stack: it was missing a tracking code in the URL query string (since it was being assigned on the Omniture backend). Well, to my amazement there is a brand new plugin that solves it. Let me introduce you to the Channel Manager Plugin.
The Channel Manager Plugin actually creates a tracking code from the Natural Search keyword within the s_code.js. Because of that, you can now use it with the Campaign Stacking plugin and have SEO keywords show up in the stack. Thank you Omniture. Now you just need a tool to actually visualize this data and classify it more easily.
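Conceptually, the plugin’s trick looks something like the sketch below: if the visitor came from a search engine and the landing URL carries no campaign parameter, synthesize a tracking code from the keyword on the client side. This is my simplified illustration of the idea, not the plugin’s actual code; the ‘cid’ parameter, the ‘seo:’ prefix, and the two-engine referrer check are all assumptions for the example.

```javascript
// Synthesize an SEO tracking code from the natural search keyword so
// it can participate in campaign stacking like any paid tracking code.
function syntheticSeoTrackingCode(referrerUrl, landingUrl) {
  // A paid click would carry a campaign parameter (here, 'cid'); skip those.
  if (/[?&]cid=/.test(landingUrl)) return null;
  // Recognize a couple of engines and pull the keyword out of the referrer.
  const match = referrerUrl.match(
    /^https?:\/\/(?:www\.)?(?:google|bing)\.[^/]+\/.*[?&]q=([^&]+)/
  );
  if (!match) return null;
  const keyword = decodeURIComponent(match[1].replace(/\+/g, ' '));
  return 'seo:' + keyword.toLowerCase();
}

// A natural search click for 'thinkpad battery' with no campaign code:
syntheticSeoTrackingCode(
  'http://www.google.com/search?q=thinkpad+battery',
  'http://www.lenovo.com/'
);  // → 'seo:thinkpad battery'
```

Once every visit has a tracking code, paid or synthesized, the stacking plugin can append SEO touches to the same stack it already keeps for campaigns.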
After implementing the Channel Mgr Plugin, you also get access to a few more reports that weren’t available before and that help with measuring SEO. For one, you can now create a Natural Search Entry page report to look at clickthroughs, bounces, and success events by natural keyword and Entry point. This knowledge allows you to evaluate whether these pages are serving the needs of these folks, and whether they are the kinds of pages you want visitors to land on for certain keywords.
After doing Channel Mgr or the Unified Sources rule, you can now classify the SEO keywords into logical buckets using SAINT. So you could create groupings just like you have for Paid Search and compare and contrast. By segmenting Brand vs Non-Brand, you can look to see what opportunities are there.
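SAINT classifications are uploaded as tab-delimited tables, so a first pass at the Brand vs Non-Brand bucketing can be generated programmatically. A minimal sketch, assuming a simple substring match against a list of brand terms (the column name and brand list are examples, not a real export):

```javascript
// Build a tab-delimited SAINT-style classification table that buckets
// SEO keywords into Brand vs Non-Brand based on a list of brand terms.
function saintBrandClassification(keywords, brandTerms) {
  const header = 'Key\tBrand vs Non-Brand';
  const rows = keywords.map(kw => {
    const isBrand = brandTerms.some(
      b => kw.toLowerCase().includes(b.toLowerCase())
    );
    return kw + '\t' + (isBrand ? 'Brand' : 'Non-Brand');
  });
  return [header].concat(rows).join('\n');
}

// 'lenovo thinkpad' lands in Brand; 'cheap laptops' in Non-Brand.
saintBrandClassification(['lenovo thinkpad', 'cheap laptops'], ['lenovo']);
```

In practice you’d eyeball and hand-correct the output before uploading, since substring matching will misfile the occasional keyword, but it beats classifying thousands of terms by hand.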
Another benefit is you can now create a Backlinks report that shows all the natural links pointing to your site, with all the paid traffic filtered out. This gives you a sense of the kinds of sites that link to you, which factors into link relevancy, which ultimately helps your SEO rankings.
6 – Martin Lindstrom, author of Buy-ology, was highly entertaining and thought provoking…and I got a free book out of it. The premise of Buy-ology is that people behave irrationally when making purchase decisions. I have a lot of thoughts on this that might take me a while to pull together into a cohesive post; I also want to read the book first. One really provocative idea was around “Smash Your Brand”: if you took your logo off your website, would people still be able to identify your brand? Have you created something so unique that someone could pick you out even without you telling them? Think about that for a bit.
Did this pretty fast, so if I wasn’t clear on something, please let me know.