Archive for the ‘SEOmoz’ Category
Posted by Justin Briggs
Hey everyone! My name is Justin Briggs, and I’m an SEO consultant at Distilled. A few weeks ago, I packed up and moved across the country to come to Seattle. Some of you might know me better as "seozombie" on Twitter. This is my first post on SEOmoz, but you can expect to see more from me here and on our blog at Distilled.
With the transition of Yahoo! to Microsoft’s Bing backend, webmasters have lost the ability to perform advanced searches using the link: and linkdomain: parameters. Rand Fishkin wrote a post about replacing the Yahoo! linkdomain: data with other data sources. Although Linkscape and Open Site Explorer provide a great data source, there is some functionality that Yahoo! had that isn’t present in other tools yet. The primary functionality I missed was the ability to perform searches against page content, not just page title, URL, and anchor text.
These link searches can help you identify link opportunities from other websites’ backlinks (such as competitors’).
Searching Content of Backlinks
To solve this problem, I set up a Google Custom Search Engine using data from Open Site Explorer. There are two data exports you can use: links and linking domains. I’ll briefly go over the pros and cons of each as a data source for a GCSE.
Links export
Pros:
- Only search content that has links
- Less noise
Cons:
- Limited to the top links
- Limited to 25 URLs per domain
- Multiple links per domain reduce domain diversity
- Limited content (5,000 annotations = 5,000 URLs)
Linking domains export
Pros:
- Search all indexed content on a linking domain
- Find linking sources not included in the OSE export
- Greater domain diversity
- More content (5,000 annotations = 5,000 domains of content)
Cons:
- More noise
- Large linking domains like WordPress.com and Blogger.com have many subdomains (lots of noise)
- Results that don’t contain a link
Setup of Custom Search Engine
Setting up your custom search engine is very easy. For this example, I’m going to use linking domains from OSE.
1) Perform search in Open Site Explorer
2) Pull linking domains for all pages on the root domain, export to CSV
3) Get list from Excel
I used Find & Replace to add a * to the end of each URL for wildcard matching. You can sort by DA or by linking domains. Google Custom Search Engine only allows 5,000 annotations, so copy at most 5,000 domains.
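If you’d rather script this step than do it in Excel, a short sketch of the same transformation is below. It takes the domain column copied from the OSE export, dedupes it, appends the wildcard, and enforces the 5,000-annotation cap. The function name and input format are my own, just for illustration.

```python
def to_annotations(domains, limit=5000):
    """Convert a list of linking domains (e.g. the domain column from an
    Open Site Explorer CSV export) into GCSE annotation patterns.
    A trailing '/*' makes each pattern match every page on that domain;
    Google Custom Search Engine caps you at 5,000 annotations."""
    seen, patterns = set(), []
    for d in domains:
        d = d.strip().rstrip("/")   # normalize trailing slashes
        if d and d not in seen:     # skip blanks and duplicates
            seen.add(d)
            patterns.append(d + "/*")
        if len(patterns) >= limit:  # respect the GCSE annotation cap
            break
    return patterns
```

You can then paste the resulting patterns into the GCSE "sites to search" list.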
4) Create Custom Search Engine
Go to Google Custom Search Engine.
5) Perform your searches
So here are the pages on domains that link to distilled.co.uk that include “link building” in the content and “resources” in the title.
This solution gives you a new way to mine for backlink opportunities using your competitors’ backlinks. You can also include linking domains from multiple competitors at the same time. However, you can only include up to 5,000 annotations at a time, so you might want to use some Excel filters to remove noise and duplicate entries.
Here are a few quick tips to speed things up.
- Remove massive domains – Large domains like wordpress.com and blogspot.com can produce a lot of noise.
- Use the –site: search to reduce noise – If a particular domain is creating a lot of noise in your search, use a negative site search to remove it.
- Search brand mentions – A search for the brand can help find the linking pages on these domains.
- Search top anchors from OSE – Find the pages that include the anchors the site is targeting.
"powered by wordpress" "distilled"
Find pages that mention the brand “Distilled” and include “Powered by WordPress”. This is an easy way to find the blogs linking to Distilled.
“guest blogger” OR “guest post” OR “guest article” OR “guest column” -site:blogspot.com -site:wordpress.com -site:wordpress.org
Find guest blogging opportunities, but filter out domains that may create a significant amount of noise.
"powered by vbulletin" AND seo
Find vBulletin powered forums mentioning SEO.
“link building” intitle:resources
Find link building resource pages.
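If you want to run queries like these programmatically rather than through the search box, Google’s Custom Search JSON API accepts the same query syntax. A minimal sketch of building the request URL, assuming you have an API key and a search engine ID (cx) from the CSE control panel (both placeholders here):

```python
from urllib.parse import urlencode

API_ENDPOINT = "https://www.googleapis.com/customsearch/v1"

def cse_query_url(api_key, cx, query, **params):
    """Build a request URL for the Google Custom Search JSON API.
    `api_key` and `cx` are placeholder credentials; extra keyword
    arguments (e.g. num, start) are passed through as query params."""
    qs = urlencode({"key": api_key, "cx": cx, "q": query, **params})
    return API_ENDPOINT + "?" + qs
```

Fetching that URL returns the results as JSON, which makes it easy to dump link prospects into a spreadsheet.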
Give it a Try & Search SEOmoz’s Backlinks
A few queries to try:
"top seo tools"
“link building” intitle:resources
"open site explorer" "powered by wordpress"
Go ahead, try it, you know you want to!
I removed linking domains with a DA greater than 90, just to remove some noise from larger domains. (Selecting this value to filter by was completely arbitrary and is just to make the example easier to use.)
Need More Queries?
I hope this helps everyone replace some of the functionality of the Yahoo! linkdomain: command. If you’ve got more link searches or ideas to add, please share.
Posted by Aaron Wheeler
Video SEO isn’t something we always think about when optimizing, but we really should. In this week’s Whiteboard Friday, Danny Dover reviews some of the video SEO basics that every SEO should know about. After all, it’s a largely untapped market, unlike the Canadian maple tree market. Which is very tapped. (The Canadian maple tree video market, however, is quite untapped, but based on my scientific and extremely boring research in YouTube, I don’t recommend you pursue that market at all).
Anyways, we have a very special visitor this week, what with all of Danny’s meta discussions this month. Great Scott! That’s what happens when you get all meta and self-referential on us, Danny.
Hello, everybody. My name is Danny Dover. I work here at SEOmoz doing SEO. For today’s Whiteboard Friday we’re going to be talking about video SEO. Now, last week I mentioned that was the most meta video we’d ever done. It was optimizing SEO resources, right? Now, this one is a video on video SEO. So this one, this one is the new champion of the most meta video that we have ever done here, and possibly the most meta video that you have ever seen. If there is some kind of disruption in the space-time continuum, totally my fault. I apologize.
That was unexpected. That was Doc from Back to the Future. A poor impression of it. Totally derailing my Whiteboard Friday. You’re killing me.
All right. Now, video SEO, huge opportunity here. This is more of a serious thing. Video SEO has low competition. You see in the universal results that video thumbnails show up about a third of the way from the top, right. You’re seeing little thumbnails. A lot of times it’s YouTube, but you also see Vimeo and lots of other video providers showing up. You are seeing those in lots and lots of SERPs, and increasingly so, actually. There is a huge demand from people because, you know, Google is doing A/B testing or multivariate testing. They’re seeing people are clicking on those. But, at the same time, you’ll have low competition. You’ll see a lot of times for very high competition keywords that have video results that the video results will just be kind of mediocre. They just kind of showed up there. Part of that is because it is new. Not a lot of people are optimizing for video, which is becoming extremely important. So, a lot of opportunity there.
The other part of this, I guess I can only talk for the United States, where I live, but the way that people are starting to consume media is changing drastically. We’ve all seen YouTube. We’ve all seen Vimeo. Now the devices people are using and the places they are watching video are different. You have things like the iPhone, the iPad, and the iPayWayTooMuchForGadgets (and I am an Apple fanboy), kind of thing. You’re seeing these all over the place. There is Android, the operating system that is running lots and lots of devices. You’re seeing the way that people are consuming media very differently. The market is growing. Based on that, the demand is high but the competition is really low. Lots of opportunity. This smells like money to me. This is huge. This is a big deal.
How do you take advantage of this? Well, there are different metrics the search engines use to look at video content. When the search engines crawl normal content, they can get some kind of idea of what text is trying to say by using their natural language processing algorithms. They can get some idea of what this text says just simply because they put so much time and so much energy into developing these algorithms to get some kind of semantic feeling for what text means. Now, this doesn’t translate directly into video because, part of the reason at least, is that video files are much bigger. It takes a lot more processing to get an understanding of it. It is a lot more zeros and ones. To help with this, Google and the other search engines let you provide meta information about a video.
The two most important ones here are the title of the video — what do you title your video. That’s probably what people are going to search for, right. If it is the shoes video on YouTube or whatever it may be. Those are a lot of times what people are searching for. That information turns out to be very important for video SEO.
Likewise, the description is also very important because it gives you more than whatever may be the character limit, probably around 140, I would guess for the title. But it gives you more text to describe it in more depth. This helps the search engines understand the video without having to go through all the intensive video processing.
Now, as video SEO is maturing, we’re starting to see more and more metrics start to affect the algorithm. So, let me be totally straightforward with this. This is just my speculation. I have not done tests on these ones. But they seem very likely to be impacting the video search results. My guess would be that they’ll be more impactful going forward. So, they are something to start paying attention to now.
The first one I see here is engagement stats. The most obvious one here is views. How many times is a video viewed? I know that when I go to YouTube and I search for something, after I look at the text, the title and the description, I then look at the views. Has this been watched 30 times or has it been watched 10 million times? It seems very, very likely to me that click-through rates are going to correlate with high view rates also. So, I think views are becoming increasingly important and are something that you should keep an eye on.
Number two is ratings. So, on YouTube they offer a five-point scale. On things like Vimeo and other things, they use a thumb up and a thumb down. That’s more similar to the Reddit system. These are actual humans who are giving their opinions and their expertise on video content. This is very helpful because search engines are designed to provide results for humans. Any input you can get from humans is helpful for getting output for humans. This is something that Google figured out very early and is something that is very important.
Number three, comments. What could be more human than commenting on videos? In YouTube’s case, it is some of the lowest thresholds of intelligence we’ve ever seen on the Internet, which is really saying something. You have 4chan; below that you have YouTube comments. It is kind of rough, right. But this is a metric of actual human beings engaging with content and with the author or producer of the video. This seems like a very important metric to me. I don’t think it is the content of the comments, because they are awful. But I think it is the volume of it and the kind of themes that people are talking about. Are they saying, "this is awesome" or "this sucks"? I think that does have some kind of impact on it.
The last one is social metrics. Really, I think this is universal. It is not just the video vertical; I think it is the other verticals as well. By social metrics, I mean things like the amount of tweets or what people are saying in tweets, Delicious popular saves, or submissions to Reddit or Digg or any of those other things. How are people talking about this with their friends? So, you have things like the QDF algorithm, which is Google’s Query Deserves Freshness algorithm. What this does is it will artificially inflate the ability for something to rank based on temporal metrics. So, if lots and lots of people are linking to something or tweeting about it, then it can artificially rank higher than things that normally wouldn’t just because it is very important. You see this a lot of times with natural disasters. Things will just rise to the top when normally they wouldn’t. Michael Jackson stuff. We saw lots and lots of QDF stuff really blowing up, making things rank when normally there was no way they would. This is something to keep in mind also. These social metrics.
Now, duration, I think, is the last one. This one is more about the extremes, finding the outlier. If a video is three seconds long, it is probably not something that Google, Bing, or Yahoo will want to rank highly. At the same time, if it is something that is multiple hours long, they might want to rank it, but it is probably not what people are going to look for when they are doing video. One of the things about video and content on the Internet in general is that people want to consume it quickly. They like bulleted lists. They like quick pictures, infographic types of things, and they like short videos. I should probably take my own advice and get to the end here. So, I’ll try to do that.
The last one we have for you is tactics. I have expressed that there is a huge opportunity here. I have talked about some of the metrics that are important. Now, tactics: the search engines have given you several tools on how to do this. Video sitemaps are not new, because they have existed for a while, but the protocol was recently revamped by the major search engines and the people who are involved with that protocol. They’ve added a couple of things that are interesting. They’ve added the location of the thumbnail of the video. They’ve added things like whether it is family friendly or not. They’ve added the URL of where the video is embedded. So, from an SEO perspective, this is really interesting. We don’t want links going to YouTube anymore because YouTube has plenty of links. Instead, with the new video sitemap, you can provide the URL of where it is embedded, and then when the search engines index that content they’ll link back to you. So, it’s not so much that you get a link from it per se, but you get the click-through. So, someone clicking on the SERP, clicking that thumbnail, is going to go to your blog, where you embedded the video, rather than to the hosting provider. This is a big win for us SEOs and for us content producers.
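To make the sitemap idea above concrete, here is a rough editorial sketch that generates a video sitemap with the fields Danny mentions: the page where the video is embedded, the thumbnail location, and a title and description. The URLs and titles are hypothetical; consult the official sitemap-video schema for the full tag set.

```python
import xml.etree.ElementTree as ET

# Namespaces defined by the sitemap and video-sitemap protocols
SM = "http://www.sitemaps.org/schemas/sitemap/0.9"
VID = "http://www.google.com/schemas/sitemap-video/1.1"

def video_sitemap(entries):
    """Build a video sitemap XML string. Each entry is a dict with the
    page URL where the video is *embedded* (so the SERP thumbnail links
    to your site, not the video host), plus thumbnail, title, and
    description. All example values are hypothetical."""
    ET.register_namespace("", SM)
    ET.register_namespace("video", VID)
    urlset = ET.Element("{%s}urlset" % SM)
    for e in entries:
        url = ET.SubElement(urlset, "{%s}url" % SM)
        # The embedding page, not the hosting provider's URL
        ET.SubElement(url, "{%s}loc" % SM).text = e["loc"]
        video = ET.SubElement(url, "{%s}video" % VID)
        ET.SubElement(video, "{%s}thumbnail_loc" % VID).text = e["thumbnail"]
        ET.SubElement(video, "{%s}title" % VID).text = e["title"]
        ET.SubElement(video, "{%s}description" % VID).text = e["description"]
        ET.SubElement(video, "{%s}family_friendly" % VID).text = "yes"
    return ET.tostring(urlset, encoding="unicode")
```

Submitting a file like this through Webmaster Tools is how the "thumbnail links back to your blog" win described above actually gets wired up.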
The other one is transcriptions. So, what could be easier than just going back and using the old tactics you already have for creating content? With transcriptions, you take video, you take the audio from the video, and you turn it into plain text. This is something that the search engines can then use and interpret just like they do a normal web page. This is important for search engines, but it is also important for human beings as well. People with hearing impairments who can’t hear this video right now can then go through and read it. They can understand it that way. International people who speak different languages can then go through the content and read at their own pace. They can use whatever tools they need to translate it. It helps spread it more. It is good both for search engines and for users, which is a win-win, and that’s always the situation I try to get when I do SEO.
I recommend that you always try to go for those win-wins, because ultimately what the search engines are doing is chasing after the idea of getting the best information to human beings. I think that’s what it really comes down to, crafting your content for human beings. It is harder to do with video SEO, but it is becoming more and more possible to do it.
I appreciate your time today. I will see you next week.
Video transcription by SpeechPad.com
If you have any tips or advice that you’ve learned along the way, or if you came back from the future, we’d love to hear about it in the comments below. Post your comment and be heard!
Posted by JoannaLord
Today I am going to talk about something that plagues companies and consultants everywhere – half-baked analysis. It’s something we’ve all done at some point, and something a lot of us still do on a regular basis. It’s unfortunate because as online marketers we all understand the power of good data mining, but time and time again we revert, at best, to generic inquiry and, at worst, to default report templates.
Disclaimer: Originally I attempted to write about the five steps I follow for solid data analysis in one post, but as I approached my 6th page of content, I realized it may be best to break it up into a series.
Alas, this will be the first of three posts, tackling a five-step process toward good data analysis. The three topics are:
- Asking the Right Questions
- Identifying What is Going Wrong
- Turning Data Into Action
Yup that’s right…cancel that afternoon meeting because you my friend are going to be stoked about data analysis in 3…2…1…
Rethinking the Questions
A few weeks ago at our SEOmoz PRO Seminar I spoke on "Analyzing What Matters & Ignoring the Rest" and I challenged the attendees to rethink the questions that guide their data research. Too often we get caught up in asking questions that, simply put, don’t really matter. Let me explain. It will always be important to know things like "How much has traffic increased?" and "Which referrers are performing better this month?", but this sort of inquiry does not qualify as marketing analysis.
Sure it’s valuable to report that to your clients or boss, but as an analyst you are tasked with much more. You are tasked with finding things others can’t. You are expected to dive into the data head first and find issues before they become huge problems. You are also responsible for finding opportunities a.k.a. the "game changer" for your company…that is your job. If you don’t like the way that sounds, please stop calling yourself an analyst. You are stressing me out.
So what questions should you be asking? Bigger ones to start.
I know they sound uber-top level, but don’t roll your eyes just yet. I challenge each of you to write these out and really think about the answers. I think you’ll be surprised with what you come (or can’t come) up with. I’m going to apply this to SEOmoz as an example.
An outsider would look at our site and say:
- We are trying to sell PRO memberships
- An increase or decrease in completed goals would show us whether we are being successful
- Losing traffic to our sign-up page, and a lower traffic count overall, would be detrimental to our success
Well that is great, but honestly SEOmoz can’t succeed solely on increasing PRO memberships. The truth is, there is a lot more to it than that. We have a recognized brand with expectations on it, and a community of over 200,000 people who come to us for the latest SEO information on the web. We can’t afford to lose ground on either of those two. These are defining qualities of SEOmoz, and strong advantages over our competitors. So my three questions would leave me with more complex answers, something like this:
- Increase organic traffic on "Learn SEO" type queries, increase branded term searches, increase YOUmoz member engagement, and increase signups
- More referrals from links to our resources, more traffic from people researching SEO, more YOUmoz submissions, more comments, improved engagement metrics on site, higher sign up attempts, higher signup completions, etc.
- Decline in branded term searches, decline in organic traffic to resource pages, decline in time on site for YOUmoz members, etc.
So now what? You are left with a handful of metrics to investigate. Those metrics should be the base of your analysis efforts. I urge all of you to revisit the reasons why you analyze what you analyze; you’ll be surprised to learn that you don’t really have a good reason most of the time. After you have your new questions nailed down and you know what metrics you want to analyze, it’s time to jump into the data.
Start Macro and Go Micro
This is when I highly suggest you fill your coffee cup, or grab another Red Bull. I also support locking your office door, or putting a "Do Not Disturb, I am Data Mining You Silly Non-Analyst" sign up on your cubicle. Okay anyway…so the main roadmap to solid analysis includes five steps, and they are:
*Please note that Analyze, Value, and Action will be covered in upcoming posts in this series.
What Do We Mean by Macro Analysis?
Macro analysis means you have a solid understanding of the different sections of your site, the different user types that navigate it, and the top-level metrics. You should know these like the back of your hand. In addition to knowing these actual numbers you should know their rate of change (how often does that data point change), the depth of change (how extreme are those changes – big jumps? small steps?), and the way they interact (is there a consistent relationship between two metrics – one goes up or down, the other does too). If this sounds like a lot to continuously track, you are right. Good analysis is a lot of work. Thankfully SEOmoz pays me in cupcakes and Champagne Wednesdays; I highly suggest negotiating for these perks.
At SEOmoz we track our top sections by week, so we can easily identify shifts in the data, and it looks something like this:
(A portion of our weekly analysis for full site stats)
You can see we aren’t just looking at our homepage; we are looking at our subdomains, our highest-trafficked sections. We are also going beyond visitors, pulling top-level stats like pages/visit, time on site, bounce rates, etc. This graph goes around to the entire company once a week. This macro-level view helps all of us understand the momentum of our site’s growth. It helps us easily isolate problem areas so we can address them before they grow into huge "Oh sh*t" moments. Trust me when I say, if you aren’t tracking your data at this macro level, you should start today.
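To make the "rate of change" idea concrete, here is a minimal sketch of the kind of week-over-week check a weekly report like this automates: flag any section whose traffic moved more than some threshold. The section names, numbers, and 15% threshold are made up for illustration.

```python
def weekly_shifts(this_week, last_week, threshold=0.15):
    """Flag site sections whose visits moved more than `threshold`
    (15% by default) week over week. Inputs are {section: visits}
    dicts, like the per-section rows of a weekly analytics export."""
    flags = {}
    for section, now in this_week.items():
        prev = last_week.get(section)
        if not prev:
            continue  # new or untracked section; nothing to compare
        change = (now - prev) / prev
        if abs(change) >= threshold:
            flags[section] = round(change, 3)
    return flags
```

Running this against last week’s and this week’s numbers gives you the short list of sections worth a closer (micro) look before they become a problem.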
What Do We Mean by Micro Analysis?
This part of the puzzle is the one that most people skip over. Micro analysis means you don’t just have a sense of how your blog’s traffic is doing; you know how many comments you get on it, how long visitors spend on it, how deep they go into your site after reading a post, and how many of your blog visitors end up converting for you. In short, micro analysis means you look at all those secondary data points that you can actually manipulate.
While it’s great to go into work on a Monday and say I want to increase traffic to my blog by 20%, it is a big feat to accomplish. Not only will it take a lot of time conceptualizing, writing and sharing that content, it will also, most likely, be less lucrative than if you took the existing traffic and increased its conversion rate by 5%. That sort of move is done by honing in on data at a micro analysis level.
Specifically, this is where things like event tracking in Google Analytics and deeper dives into your preferred analytics package come in handy. Everyone has their own approach to micro analysis, but I think a good place to start is to see where successful events (downloads, subscriptions, sign-ups, conversions, etc.) are taking place and see if you can come up with common denominators. If you see that successful pages all have one or more things in common, you can start testing these on other sections to increase conversions across your whole site. Here is an example of what we pull for SEOmoz:
(A portion of our micro tool usage analysis report)
We can see which tools are performing the best, and analyze those pages to see if we can isolate page tweaks to roll out across all tool pages. It seems simple, but way too often analysts look into analytics to see how they are doing, and fail to put in the time required to uncover what they could be doing for increased success. You should know, for every single section and user type on your site, what makes it "successful." You need to be tracking these "successes" as closely as you would your visitor count.
Well this post got a little long, but I really wanted to give you guys some real examples on how I approach data analysis both at the macro and micro level. Hopefully, you can take some of this and apply it right away. I know we all have our own unique approach to analysis, and I’d love to hear yours in the comments below!
Next post I will be talking about the "analyze" step of a solid analysis strategy. That post will hone in on quick ways to figure out what is going wrong. I will talk about some GA features that you can use to make your analysis more effective and less time consuming. So stay tuned!
Posted by randfish
Ugh… Part of me just wants to link to this old blog post and leave it at that.
But, since there’s actually a bit of data to share helping to show that (at least so far) Google Instant changes less than your average algorithmic rankings update, let’s share.
880,000 Search Visits Analyzed
Conductor released some nice research from anonymized data of sites on their software platform making a compelling case:
If Conductor keeps putting out this kind of stuff, they’ll be a "must-read" in no time
Hmm… Looks pretty darn similar to me. A tiny increase in 4, 5 and 6 word phrases would seem to go against many of the prognostications and fears that this move would decimate the long tail (though, to be fair, plenty of savvier search folks predicted a slight increase as Google’s "Suggest" function would be more obvious/visible to searchers and push them to perform more specific queries).
Google Search Traffic for SEOmoz & Open Site Explorer
While I don’t have as much data to share as Conductor, I can show you some tidbits from SEOmoz.
Here’s SEOmoz.org’s traffic from Google in the past week compared to the week prior:
And here’s a similar look at OpenSiteExplorer’s Google traffic:
There’s a suspiciously small amount of change in the keyword demand, and although these are certainly unrepresentative of the broader web, we can be relatively confident that lots and lots of folks in our industry, performing queries that might lead them to these two sites, have awareness of and are using Google Instant.
One change that did catch my eye (thanks to some Tweets on the topic) is that Google’s Suggest itself seems to have changed a bit:
Hard to complain about that
Other Sources Worth Reading on the Topic
I was a bit dismayed to see so many in the SEO field taking this as a serious threat or even touting the massive "changes" that would be coming soon to SEO best practices or even search query demand. We’re usually pretty good about shrugging off Google’s pressbait around technical changes that don’t have much of an impact, but this one seemed to have more legs than usual.
That said, there are a few pieces I think warrant a read-through (or at least, knowledge of):
- Google Instant was Actually a Counter to Bing’s Advertising Blitz from SearchEngineLand
- Google Understand Not Google Instant from Michael Gray
- 3,000+ Mashable Voters Prefer Google Instant (Slightly) from Mashable
- Sad to link to this PCMag Retraction Showing Their Tragic Ignorance of Search – they confused SEO and PPC, thought marketers might need to buy ads on partial words ("flow" for "flowers"), etc.
- Explanation of Ad/Query Triggering (basically it’s 3 seconds of inactivity) from AdWords
- The Complete Google Instant Users Guide from SELand
- A Great Image from SEOBook showing how Instant further marginalizes the organic results
Very much looking forward to the discussion, but I’m leaving for Social Media Week Milan and will be hard pressed to contribute at normal levels until my return next week. Until then – Buona notte!
p.s. If you have data to share on how Instant has or hasn’t impacted your traffic-driving queries, that would be awesome. If you blog/upload it, we’ll be happy to update the post with links.
Posted by Jamie
Today I’ll talk about one of my favorite topics, Conversion Rate Optimization (or CRO). I won’t be speaking about tools, case studies, or tips on what layouts or buttons colors work best; Dr. Pete, Paras Chopra and Oli Gardner have written some excellent blog posts on these topics recently. Instead, over the next several weeks, I’ll be posting a few lessons I’ve learned from doing CRO successfully (and unsuccessfully) for a variety of organizations. These are things I wish I had known when I got started.
Today’s post will focus on how to convince your organization to do CRO.
Make the Case for CRO using Simple Math
CRO may be popular on online marketing blogs, but I’m always surprised to learn that most organizations aren’t doing it. At the recent SEOmoz PRO Training Seminar in Seattle, conversion rate guru Tim Ash asked the audience how many of their companies were doing CRO. Of the 300 or so in the audience, only a few dozen individuals raised their hands. Of all the things I’ve worked on in online marketing, nothing has delivered a higher ROI than conversion rate optimization. And yet, it remains less popular than it should.
One explanation I’ve heard is that it’s difficult to get started. But with free tools like Google Website Optimizer, and affordable, yet capable services like Unbounce and Visual Website Optimizer, this excuse is quickly losing ground. The best explanation I can venture is that CRO doesn’t happen because it’s difficult to prioritize against the stack of urgent projects that marketing teams tackle each day.
Your first job should be explaining the potential return on investment of a CRO project. If your marketing team, boss, or client knew the estimated ROI of CRO using metrics from their own business, they’d be more likely to prioritize it ahead of other projects. So what’s the best way to make the case for CRO?
Use simple math. Take the numbers of conversions/goal completions from key process of your website, and show what would happen if they performed better. Imagine saying this to your boss or client:
The above example was generated using a simple Excel spreadsheet I created. Download the worksheet and just fill in the white cells with blue text (further instructions are later in this blog post). The spreadsheet will calculate a simple ROI and provide an easy, yet surefire argument.
The boxed quote above reflects the outcome of a retail web site example that has 632 sales a month with an average transaction size of $40. See the details in the screenshot of the spreadsheet below:
What to enter into the spreadsheet:
User Experience Name
A friendly name for the user experience you are considering optimizing using Conversion Rate Optimization. For this example we are using the Checkout page of a typical retail e-commerce website.
Visits
I’d recommend using the number of total Visits (for an average month) to the first page of the user experience you’d like to optimize using CRO. In the example above, this is how many Visits the checkout page received in a given month. I believe Visits are better than Unique Visitors because they count someone who visits twice during the same day as two distinct visits. I wouldn’t recommend using Page Views in this cell, since page reloads and other behaviors can make this number larger than it should be.
Conversions
The monthly conversions or successful completions of this user experience. In this example, the number of times a purchase was made from the Checkout page. For simple websites that have a single purchase experience, this is usually an easy number to determine. If not, make a best guess.
Average Cash Per Conversion
This is how much money you make, on average, for each completed conversion. It is an optional but desirable field: a monetary estimate makes for a more compelling argument. For the example above, the company makes an average of $40 for each transaction. If you are a subscription business, this is where you would enter your customer lifetime value.
If you don’t have easy access to monetary values like average purchase size or customer lifetime value, just use the raw number of conversions to make your case. Using the data entered above, that would be the following (note that the Excel worksheet provides both):
Conversion Rate Increase
The estimated improvement that might be achieved using Conversion Rate Optimization. What percentage increase should you use? It’s up to you, but I like to estimate a 10% improvement, because it’s believable, and if your user experiences are not already very well optimized, this percentage is usually easy to achieve. But in my experience, if executed well, your first test will do much, much better.
Keep it simple.
This is a simple ROI calculation. Some may argue it’s too simple, but it makes a compelling argument that’s easy to grasp. The key lesson here is that while a 10% lift may not seem monumental, the expected ROI often is. And for a low effort with a big reward, it’s a slam dunk. Use simple math to make your case and you’ll have a better chance of getting your organization on board with conversion rate optimization.
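To make the arithmetic concrete, here is a minimal sketch of the math the spreadsheet is doing (the formulas are my assumption of a simple ROI model, not the actual worksheet):

```python
# A hedged sketch of the spreadsheet's simple ROI math: monthly conversions,
# average revenue per conversion, and an estimated lift from CRO.

def cro_roi(monthly_conversions, avg_revenue_per_conversion, lift=0.10):
    """Return (extra conversions/month, extra revenue/month, extra revenue/year)."""
    extra_conversions = monthly_conversions * lift
    extra_monthly_revenue = extra_conversions * avg_revenue_per_conversion
    return extra_conversions, extra_monthly_revenue, extra_monthly_revenue * 12

# The retail example from the post: 632 sales a month at $40 each, 10% lift.
extra_conv, extra_month, extra_year = cro_roi(632, 40, 0.10)
print(f"{extra_conv:.0f} extra conversions/month")
print(f"${extra_month:,.0f} extra revenue/month")
print(f"${extra_year:,.0f} extra revenue/year")
```

Plugging in the retail example from above yields roughly 63 extra conversions and about $2,528 in extra revenue per month, or about $30,336 per year.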
What’s worked for you?
What’s helped you convince your organization or client to start doing conversion rate optimization? Please let me know in the comments!
Jamie Steven is the VP Marketing at SEOmoz, and a lover of pumpkin-flavored beverages including lattes and beer—both excellent choices for chilly fall weather.
Posted by Sam Crocker
Hi there folks!
Today we are going to take a look into Foursquare and, more specifically, we’re going to check out how to use it for events and conferences and uncover some of the answers to the questions that aren’t as easily available through the Foursquare site.
Quick Background on Foursquare
So, we’re not going to waste too much time on an introduction to Foursquare, because hopefully you’ve already been focussing on ways to incorporate this into your marketing plan. The implications for any business with a storefront or actual address are fairly straightforward, though the implications for online brands are a bit more difficult to tap into.
It’s Not just for Stalking! (Image via: Geek and Poke)
I had originally prepared a post dedicated to Foursquare and its impact on local and small businesses; however, it seems SEO Doctor was one step ahead of me and produced this impressive guide before I was able to get my post out of the drafts folder here on SEOmoz. His post is extremely comprehensive, so be sure to check it out!
I’ve been hearing loads of people talking about how "they don’t get it" in reference to Foursquare and it’s worth pointing out how many people were having trouble understanding Twitter as well. I would definitely recommend familiarising yourself with Foursquare now – especially if you work with any local companies.
The growth of Foursquare, Gowalla & Facebook Places has been extremely convincing, and the limited number of people making use of the "Specials" available by claiming your local business (for FREE) with Foursquare seems like an obvious missed opportunity – ignore Location-Based social media at your own peril.
Using Foursquare for Conferences and Events
At any rate, as you know, Distilled and SEOmoz have been busy over the last several months preparing for the #mozniar and the PRO seminar in London (for which I would be remiss not to quickly let you know that tickets are still available). In this preparation we have been looking heavily into ways to spice up the event.
Given my mild obsession with all ways to earn seemingly meaningless points and my newfound hobby of Foursquare Roulette (hop off at a tube station and randomly try whatever looks entertaining in the area), I proposed we look into Foursquare and what sort of things we might be able to do with it for the conference.
If any of you are as nerdy as I am, then I’m sure you will have noticed how some of the biggest brands as well as some of the largest events in the tech and music industries have been able to get their own Badges you can unlock by checking-in at various locations.
Screen Cap from Tony Felice
The first thing you’ll notice is that these are not small affairs, and there are obvious reasons why these clients were able to strike a deal to get a badge. You might also notice – if you’ve looked into this previously – that it can be fairly difficult to find any information about how these deals are struck, and it can be equally difficult to get in touch with the folks at Foursquare about striking up a deal.
Getting to the Source – An Interview with Eric F.
After enough prodding and digging through my own social networks for any potential "in", I consider myself very fortunate to have been able to get in touch with Eric, who happens to be the Director of Client Services and was incredibly helpful and happy to speak to us.
Rather than be selfish with the responses, I thought I would share some of the answers he provided to my most burning questions about it all. Here are the responses I was able to dig up; I’ll include a brief recap of the implications afterwards:
What would be your top tip(s) for making the most of Foursquare for conferences and Events?
EF: Setting up goals ahead of time is the best way to plan for a conference. You may have a single-day event and encourage people to check-in early, or you may have an event spread out over a few days or weeks and have people check-in early, in the middle, and at the end. We look at foursquare as a flexible platform, depending on event planners’ needs. Some folks have found success with contests or tips to visit different booths. Others use foursquare as a way for attendees to connect and to see who else is at their event.
One recent conference used foursquare to show which events had the most attendees, and then gave the speakers a chance to connect with their audience via Twitter after the fact.
There is a very robust API available as well that gives event organizers the ability to show live check-ins and other interesting data about the event in real time.
We’ve seen badges from events and conferences (e.g. CES, Bonnaroo, SXSW, etc.) in the past – how does that work? Is that a service that people pay for? Is there generally a threshold for how "big" or "cool" an event is? Or is it more just about getting in touch?
EF: We’re still in the early days of this, and have been testing different approaches around partner and event-related badges. Sometimes, we choose a venue because of a cool use of the platform; other times, it’s to reach a new audience.
In the future we hope to roll out a more structured plan for event planners and conferences – but for now we are inspired by the ideas and implementations we have seen from these events.
Is this a market Foursquare has considered? There seem to be loads of conferences and events and it seems like partnerships (with badges and such) could be a real opportunity.
EF: We are concentrating on the best user experience possible. If this comes at events and conferences, we are doing our job right.
What things can/should any event organiser do with Foursquare in the short-term? Obviously there is more to be done for a massive festival or conference, but what about one-off events or smaller time affairs?
EF: We look at our loyalty offers (in the form of special offers and mayor offers) as a big win for anyone with a physical location. These reward people that go somewhere for the first time, or are loyal customers. This also lets merchants track success with redemptions and foot traffic.
We know that business accounts are free, but how do your partnerships work with larger brands? Is there a general price range on these? How much does it cost for a brand to get their own badge? What if they want more than one?
EF: All business partnerships with foursquare are totally free. This includes someone with a single location such as a bar or restaurant, to a national retailer with 10,000 locations. To be 100% clear, we offer the ability to see analytics, run specials, and interact with new and loyal customers totally free right now.
Badge programs have a monthly cost associated with them that is tied directly to promotional consideration and reach, as well as to the longevity of the campaign.
Who should large brands try to get in touch with if they want to team up with Foursquare?
EF: We have a dedicated support area for businesses: http://support.foursquare.com/forums/177952-foursquare-for-business
This ensures that the proper person will be able to answer the proper question whether it comes in from a local merchant, large chain, agency representing a brand, event question, or anything else that may arise.
What about smaller brands?
EF: Same as above – funneling requests through one system ensures that someone on the team gets back to people quickly, correctly, and promptly.
Finally, any previews/things in the works for business/marketing uses of Foursquare you’re willing to share?
EF: Knowing where events are happening, or where people are gathered, is a great metric of discovery. We’re all about letting folks know when something is happening, and most importantly where it is happening. We are looking at ways to empower users and businesses by giving them this knowledge at their fingertips.
Making Sense of it All
No surprise that the Foursquare team are keeping some of their cards fairly close to their chests, but there are definitely some key takeaways from this.
1. You don’t have to be a global brand to get the hook-up. It seems pretty clear that creative uses of the API are a definite way to grab attention from the folks at Foursquare, and are potentially a clever way to get your own badge.
2. There is no doubt that Foursquare and other location-based social media platforms are growing, and now is the time to make sure that if you work with any local businesses you get on the ball, get your venue registered, and go to town. I would not be the least bit surprised if, in some fashion or another, this sort of data (rankings, tips, check-ins, etc.) becomes quite valuable to the team over at Google when it comes to looking at local ranking factors.
3. If you decide to make location-based social networking part of your plan – let people know! There’s no sense building the most incredible API to date to be used at your event, venue, etc. and not letting people know about it.
4. Even if you can’t get your event/conference its own badge there is still plenty you can do to engage Foursquare users.
Examples for Short Conferences
- Be sure to set up your venue(s) as locations
- Create multiple venues for the same location (e.g. "Conference Room 1" "Bar" "Exhibition Hall" etc)
- Rewards for check-ins (forget about Mayors – focus on the short term)
- Make use of Existing Apps. Check out ScreenScape, LocaModa, 2Know and if you’re in London tell people to try out FourTap
- Create a new App
- Encourage early check-ins and sharing via Twitter
- Splash some cash and get your event a badge
Examples for Longer Conferences
- As above
- Have incentives for multiple check-ins
- Encourage check-ins from multiple venues
- Offer a prize for the mayor of the conference
Where exactly we end up along the spectrum of "things you can do" for London PRO this year is still a work in progress, but you can bet I’ll be championing meaningless points and our own spin on the thing – and you can be sure we’ll let you know what we come up with.
A very big thank you to Eric F. and the Foursquare team for taking the time to answer our questions!
Please let us know your thoughts, and share any successes/hiccups you’ve had using location-based social networking in the comments section below.
Posted by Aaron Wheeler
Hello everybody! My name is Aaron Wheeler and I do customer service here at SEOmoz; if you call us or email us, there’s a 50% chance you’ll end up talking to me. Oh well! Your loss is my gain. =) Anyways, one of my new tasks in the office will be video production so you may end up seeing my gob around the blog every once in a while. I’ll be the main one posting these Whiteboard Fridays in the future as well as some of the other glorious cinema we create to vitalize your ears and eyes. It’ll be fun! If you have any feedback or ideas, I’d love to hear them; you can reach me by email or twitter at my contact page: Aaron Wheeler. Nice to meet you!
This week, our very own Danny Dover discusses some important and scalable ways to optimize your SEO resources. We all know that pickins’ can be slim when it comes to many companies’ budgets for SEO, so why not make the best of what you’ve got? Danny has some ways that you can get the most bang for your (and your boss’s) buck.
Hello, everybody. My name is Danny Dover. I do SEO here at SEOmoz. Today we have something a little bit special. We’ve bought all new equipment and new microphones. We heard your comments in the blog posts that our sound quality was a little bit "meh." So, we’re trying to make it a lot less "meh." So, please give us your continued feedback in the comments below.
For today’s Whiteboard Friday, we’re going to be talking about optimizing your SEO resources. So, according to my research, this is the most meta SEO Whiteboard Friday we’ve ever done. We have optimizing and then, of course, the O in SEO stands for optimization. So, if there is some kind of time warp or something that goes on, just expect it. That’s part of the downside of this job – sometimes you disrupt the universe. Oh, well.
So, I’ve broken this down into three categories that I recommend.
1. Define Goals
The first one is define goals. Just like self-help books, goals are very important, right? That analogy didn’t work per se. Maybe I need to read more self-help books. That would be a good idea. Define goals, right. I have broken that down into three subcategories.
Find your highest ROI customer. This is a little bit counterintuitive, but it makes a lot of sense. I recommend doing this first. If you have an established website and you’re trying to optimize your SEO resources, you’re already going to have some data on who your customers are. Let’s say you are a newspaper website. I’m sorry, first of all. Hard times for you, but good luck. So, if you’re a newspaper website, you’ve got to figure out if it is your politics readers who are going to make you more money or if it is going to be your sports readers. Then, based on this information – you get this from your analytics and some Excel work – figure out what you can do to target those customers specifically. So, really maximize the money you are already getting. You have these resources in place; make sure that you are getting the most out of them. That is kind of the key to optimization.
Identify your budget. They say creativity is dictated by limitation. The Google homepage is always the example I hear about this. Although, if Google is watching, you’re kind of going overboard lately. The one where the balls went flying everywhere annoyed the heck out of me. Please don’t do that again. Identify what your budget is. This is going to dictate everything that you are able to do. Are you going to be able to hire a whole team of content writers? Are you going to be able to get SEO consultants on board with you? How are you going to do all of your web development? It’s all dictated by budget. So, know what that is going forward. Get it on paper. Make as elaborate a budget as you can upfront so that you know what you are going to be able to do going forward.
Develop a Content Strategy. This is the one where I see mistakes made most often, myself included. I mess this up all the time. The key to SEO – and you have heard lots of talking heads like myself say this in SEO spaces – is content. That’s because when people go to search engines, be it Google, Bing, Yahoo, or whatever it may be, they’re going there to find content, right? That’s the purpose. They want some kind of question answered. The key to SEO is content: building the thing that Google is going to want to index and provide in its search results. You need to figure out how that is going to happen. Who is going to be writing this content? If it is a blog post, is Jamie from marketing going to be the one to write it every week? Is it going to be every week? Is it going to be every day? Do you have a signed contract from Jamie saying he’s going to do this? What happens if he is out on vacation? How are you going to get the content produced every week? Who’s going to write it? Who’s going to edit it? How is it going to get published? You need to figure out these details as early as you can and get them ironed out on paper.
2. Calculate Impact vs. Effort
Calculate impact versus effort. This one is kind of core to optimization. Figure out your lowest hanging fruit; this is the first one. I found the best way to do this is – this is super self-promotional here – using OpenSiteExplorer.org, which is a SEOmoz product. It’s free. You don’t even have to sign up to use the basic version. With OSE, with Open Site Explorer, you go in there and put in your URL, or your competitor’s URL if you are really clever. Click on the tab that says top pages, and it will show you all of the top pages by links, so which page has the most links to it. It will show you the status codes. If it is a 404 error, it means you have links going to that page, but you are not getting any SEO value from them. The same thing with a 302 redirect: if it is not a 301 redirect, it is not going to help you from an SEO perspective. These are links that you already have. You’ve already done the effort to make these work, but you aren’t getting any benefit from them. They’re the low hanging fruit. Again, that is OpenSiteExplorer.org.
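To show what this triage looks like in practice, here is a hypothetical sketch that filters a top-pages CSV export for linked pages returning 404 or 302 (the column names and data are made up for illustration; adjust them to whatever your export actually contains):

```python
# Hypothetical sketch: flag "low hanging fruit" pages from a top-pages
# CSV export. Column names here are assumptions, not OSE's actual format.
import csv
import io

# Stand-in for an exported file: URL, HTTP status, linking root domains.
export = """url,status,linking_root_domains
/old-guide,404,120
/promo,302,45
/blog/post,200,300
/moved,301,80
"""

def low_hanging_fruit(csv_text):
    """Pages with links that return 404 or 302 pass no SEO value: fix these first."""
    rows = csv.DictReader(io.StringIO(csv_text))
    return [(r["url"], r["status"]) for r in rows if r["status"] in ("404", "302")]

print(low_hanging_fruit(export))  # [('/old-guide', '404'), ('/promo', '302')]
```

The 200 and 301 rows pass through untouched; only the pages whose existing links are being wasted get surfaced for fixing.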
Meet with stakeholders. About three months ago now, we did this at SEOmoz. We brought in all the heads of the departments here and a couple of other important influencers for the company. We put them all in one room and asked, "Okay, what are everyone’s top priorities?" What do they want to see happen in the year to come at SEOmoz? We wrote our suggestions on sticky notes and put them up on the board. It was not surprising that they varied by department. I am in the marketing department here, and mine happened to be marketing goals, whereas the developers wanted more back end things to happen. The bus dev people wanted more, like, "We should make more money. That’d be a great idea, right?" That’s why they’re the bus dev people. Then we had operations, who were doing other things like that as well. We put all of these on a whiteboard, discussed them all, and then voted on them as a team, weighing how much effort each would take against how much benefit the entire company as a whole would get. We found that this exercise provided a lot of value for us. It is actually the roadmap that we are using today.
3. Document Repeatable Processes
Document repeatable processes. This is kind of self-explanatory. In SEO, there are lots of tedious projects you have to do. Let’s say it is link building. You’re going to do the keyword research, figure out what anchor text you want to target, and then go through and find out what the relevant link sources are for that. You’re going to contact the right people and ask them, probably using a template. Or you’re going to build some content so it can attract links naturally. That’s the way I like to do it, just as a side note.
With these, whatever your process is, whatever you find works for you and your organization, document the processes. Write down every single step. I do this for two reasons. The first one is so that I know I am not missing a step when I go through it. A lot of times when I have done a process for the umpteenth time, I skip a step just because I am human, I get bored of it, and I stop paying attention. But if I have a checklist in front of me, I can go through and make sure I don’t skip anything. The other reason is scalability. If you can take this process and hand it off to more people to do the same process as you while you are doing it, then it is going to scale, right? You’re going to get more throughput from this process. I have found this to be extremely successful here, and especially when I was doing SEO work with clients in the past. If I push this off to other people at the company who are doing similar things, we can maximize the impact we get with minimal effort from the people involved. So, it’s fantastic.
That’s all the time I’ve got today. I look forward to hearing all your comments below. Thank you. I’ll see you next week. Bye.
Video transcription by SpeechPad.com
If you have any tips or advice that you’ve learned along the way, we’d love to hear it in the comments below. Post your comment and be heard!
Posted by randfish
In the last year, there’s been a plethora of entrants to the field of link building services beyond the traditional software-based approach of reverse-engineering competitors’ backlinks (like our Link Intersect, LAA, or Open Site Explorer tools) and consulting/direct purchase. In this post, I’ll try to cover some of the interesting major new services, as well as present some long-standing options that some SEOs may not have discovered.
I’ve segmented the services below into unique sections to help differentiate the types of link building they offer. Some are more service-based, others are pure-software and the first section is more visibility-based than direct link acquisition.
One of the more unique offerings in the last few years, Zemanta lets publishers submit a feed of content or images to them, which then appear in front of bloggers in the "composition" window (while they write their posts). These are labeled as "related posts" and have multiple benefits:
- They can improve branding amongst a blogging audience (as bloggers will see your site/brand name while they write)
- They can draw in direct links (if the blogger chooses to link to your work in the post or as a "related post" at the bottom – or through links from image references)
- They can attract direct traffic from the bloggers themselves, who are likely to click on links/content that appears to be interesting
You can try Zemanta’s service via a demo on their site.
Zemanta has (according to their team) been formally approved by Google’s search quality folks as a white-hat service (which makes sense since all they’re doing is showing advertising content to writers, who then determine if they want to link or not) and is now included in WordPress and Blogger.
SEOmoz has been using them for over a year now (we started with a trial and continued on) and we’ve seen good results – we tend to get a half dozen or so links to our content (the blog and YOUmoz) each month which can be seen through their reporting system (which has some upgrades in the works).
*Other than our paid use of the service, SEOmoz does not have any affiliations with Zemanta or its founders.
Founded by Ann Smarty, MyBlogGuest provides a platform for those seeking to write and receive guest posts. The service is relatively simple, but potentially quite powerful. If a reasonable number of quality blogs and sites participate in the marketplace, the opportunities for providing great posts and receiving traffic and links back are tremendous (as are the opportunities for those seeking more content and relationships).
Blogging is an inherently social field and while the links may be a primary driver for many interested in the site, Ann has made it clear that she hopes deeper relationships will emerge from the connections. The site’s layout and signup process are impressive and compelling, though driving action once inside the platform could still use a bit more polish.
You can read more about the project in SearchEngineLand’s interview with Ann from February.
I’ll be surprised if some Silicon Valley-style startups don’t pop up to copy this model. Hopefully Ann can stay far enough ahead of the game through a network effect to remain competitive. It’s a terrific idea that needs only enough branding and awareness in the space to take off.
*SEOmoz does not have an affiliation with this site, though we have contracted Ann, personally, to do projects for us in the past.
Originally known as Enquisite, EightFoldLogic, a software company with offices in Victoria, BC and San Francisco, has recently launched a marketplace of their own for website owners of all stripes called "Linker." The premise is similar to MyBlogGuest, but the audience is wider and the interface more customized for creating one-to-one, private connections.
Linker enables the creation of "criteria" much like personal ads for linking connections
Within a day of signing up in a single category, I had four potential "matches"
Linker’s goal is to connect sites and marketers interested in partnerships or link relationships with one another. Since their service ends at the time of connection, the method of obtaining the link is up to the parties involved. This means plenty of white hat options, but also potential gray hat ones – however, EightFoldLogic’s Richard Zwicky and the audience they’ve traditionally attracted lean white hat, so I expect this won’t be an issue unless the audience changes substantially.
The concept of marketplaces for link acquisition and connecting to site owners interested in links is a compelling one, but the key, as with MyBlogGuest above, will be achieving the critical mass of users necessary to make the service valuable. To that end, Linker’s made their product completely free for the next couple months – you can sign up here.
*SEOmoz provides link data via our API to EightFoldLogic but does not have a financial stake in the company or this product.
Whitespark’s Local Citation Finder
The concept is to find sites that are included in Google Local’s "sources" for maps and local review data that link to or reference multiple sites that rank in the local results. It’s a simple idea, but well executed and incredibly useful for those seeking to optimize their local listings. You can try the Local Citation Finder here – results take just a few minutes to be returned.
Enter some data about your site/goals and the citation finder will email you potential sources for listings
As the local results grow in importance and competition, and as the value of having these consistent, multiple listings rises, I suspect this tool will be incredibly popular. I’d love to see further productization around showing more data about the importance/value of particular local listing sites, and some opportunities to help control and manage those listings, but this first version is pretty exciting on its own.
*SEOmoz does not have a financial or product relationship with either WhiteSpark or Ontolo, though we have been talking to the latter about use of our API in other products.
Although there are dozens of other services I’d love to cover, these are some of the most interesting to me, personally. As always, looking forward to your thoughts and recommendations, too!
Posted by bhendrickson
LDA is remarkably well correlated with SERPs, but substantially less so than I thought or claimed. The expected correlation (as measured by the expected Spearman’s correlation coefficient over our dataset) is 0.17 instead of 0.32. I found a mistake in the calculation that produced the 0.32 score.
0.17 is a fine number, but it is awkward having previously claimed it was 0.32.
Statements I made in the past two weeks along the lines of "LDA is more important (as we measure it, yada yada) than other ways we’ve found to evaluate page content, and even more surprising than any single link metric like the number of linking root domains" are incorrect. A corrected statement would be "LDA is better correlated (yada yada) than other ways to measure page content relevance to a query that we’ve looked at, but less correlated (yada) than several ways to count links."
Topic modeling is still another promising piece of the pie, but the slice is not as large as I thought. Or claimed.
A slightly long-winded description of the bug and what evidence there was of it:
I was looking into the discrepancy between Russ Jones’s chart, which showed roughly a linear relationship between SERP ranking and sum LDA scores, and Sean Ferguson’s chart, which showed a huge jump for the mean LDA score but the rest pretty random. Russ Jones had based his chart off our tool. Sean based his chart off the spreadsheet. After looking at it for a little bit, it was pretty clear the source of the discrepancy was that the tool and the spreadsheet are inconsistent.
I tried reproducing a few results of the queries in the spreadsheet using the tool. After about a dozen, it was clear the spreadsheet (compared to the tool) had consistently higher scores for the first result and consistently lower scores for the other results. That is technically referred to as the ah shit moment.
I reviewed the code that differs between the web page and the spreadsheet, and found a bug that explains this. When generating scores for the spreadsheet, it caused the topics for the query to be largely replaced with the topics of the first result. This made the first result score too highly and later results score too low.
Excluding the first result from every SERP, the bug actually made the results less correlated in the spreadsheet, but the boost from getting the first result correct was enough to raise the overall correlation a lot.
A Few Related Thoughts:
- When I release statistics in the future, I will continue to try to ensure we provide enough data to verify (or in this case show a flaw with) the calculation. Although I found the bug, it was only a matter of time before someone else would try reproducing a few of the queries in the tool and see the discrepancy. So releasing data is a good way to ensure mistakes get discovered.
- The actual expected correlation coefficient, 0.17, is still exciting, at least to us at SEOmoz. But the smaller number is less exciting, and it really, really sucks that I first reported the expected value for the coefficient as 0.32.
- Some have claimed there is something invalid with measuring correlation by reporting the expected value of Spearman’s correlation coefficients for SERPs. They are still wrong. Two wrongs don’t make a right. My programming mistake doesn’t invalidate any of the arguments I’ve made about the math behind the methodology.
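For readers curious about what "expected value of Spearman’s correlation coefficients for SERPs" means in practice, here is a minimal sketch (the data is toy data and the details are my assumptions, not SEOmoz’s actual pipeline): compute Spearman’s rho between a page metric and SERP position within each SERP, then average those coefficients across SERPs.

```python
# Sketch of "expected Spearman correlation over SERPs": per-SERP rho,
# then the mean across SERPs. Data below is illustrative toy data.

def spearman_rho(xs, ys):
    """Spearman's rho for two sequences without ties: 1 - 6*sum(d^2)/(n(n^2-1))."""
    def ranks(vals):
        order = sorted(range(len(vals)), key=lambda i: vals[i])
        r = [0] * len(vals)
        for rank, i in enumerate(order):
            r[i] = rank + 1
        return r
    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

# Two toy SERPs: positions 1..5 and an LDA-like score for each result.
serps = [
    ([1, 2, 3, 4, 5], [0.9, 0.7, 0.8, 0.4, 0.3]),
    ([1, 2, 3, 4, 5], [0.5, 0.6, 0.2, 0.3, 0.1]),
]

# Position 1 is "best", so a helpful metric correlates negatively with the
# position number; negate the scores so "good" comes out positive.
expected_rho = sum(spearman_rho(pos, [-s for s in scores])
                   for pos, scores in serps) / len(serps)
print(round(expected_rho, 2))  # 0.85 for these toy SERPs
```

The per-SERP coefficients here are 0.9 and 0.8, so the expected value over the (tiny) dataset is 0.85; the real calculation does the same thing over thousands of SERPs.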
- Mistakes suck. I feel shitty about it. I’m particularly sorry to anyone who got really excited about 0.32.
Here is a corrected spreadsheet and below is a corrected chart. For historical purposes, I’ll leave the incorrect spreadsheet available. I’ll edit the two prior LDA blog posts to include links to this one.
Posted by JoannaLord
Well, yesterday was a big day on Twitter, wasn’t it? I don’t know about you, but I was glued to the live stream of the not-so-top-secret Twitter press conference at exactly 3:30 pm and watched closely for an hour and a half while @Ev and @Biz told us all about the new "bigger and better" Twitter.com. The founders outlined many of the recent achievements they have seen in the growth of their community and announced the release of a brand new interface for Twitter.com, which will be rolling out to all users over the next few weeks. (It’s important to note that currently only 1% of users have access to the redesign; that decision was not so well received.)
The new interface has a renewed focus on the user experience, with in-stream multimedia expansions, more search capabilities, and an all-around sexier, more fluid feel. I went crazy yesterday playing with the new interface and wanted to share way too many screenshots along with my thoughts on the new layout. I am excited to hear what you guys think all of these changes mean, so let’s do this, shall we? What are the big changes to our beloved Twitter.com?
1. Redirect users back to THEIR WEBSITE – Whoa!
I have to admit I got a little feisty yesterday when I saw my stream fill up with tweets that said things like "that is it?!" and "it’s just a new interface, what’s the big deal?!" Twitter has over 160 million users, but as we all know, many of those users use third-party Twitter clients rather than the web interface itself. Ev noted yesterday at the conference that Twitter mobile users are up 250% year over year, which was the motivation for them to release their own mobile apps earlier this year. While this mobile surge has meant huge growth for the community, it hasn’t done as much for their on-site value. The announcement yesterday was important because it was their first real attempt to redirect those millions of users to a more compelling on-site experience. Whatever the long term goal is for Twitter.com the website, yesterday’s announcement was a huge step toward a more united community of users. This.is.a.big.deal.folks.
(The new Twitter.com… ohhh pretty!)
2. A whole lot more space for …. uhmmmm advertisements?
So now that we have refocused our attention and time back to Twitter.com, what will they do with it? Well, sell us things, obviously. As you can see below, there sure is a lot more space for Twitter to fill. You will notice the "Sponsored Tweets" and the "Who to Follow" elements are more prominent. In addition to that, you will see some open areas (that look a lot like traditional ad space units) laced throughout the platform. In general, I think it’s pretty clear that they used this UI redesign to give themselves more options for the upcoming advertising platform we keep hearing about.
(Notice all that space they get to play with!)
3. Focus on other tweets, searches…you know uhmmm NOT your tweet
During the press conference, Ev mentioned specifically that Twitter is a unique community of users in that not everyone actually tweets. He noted plenty of people use it just to listen or research…very "search enginey" if you ask me (yes, I just made that word up). The new design certainly focuses less on my actual tweet and more on the experience I am having as a Twitter user. You will see the search box was moved to the top right, and it has much more functionality than before. I can see tweets with my searched word(s), "tweets with links" containing that word, "tweets near me" with that word, and profiles or people that include that searched word. This is a far better experience all around if you ask me, again compelling users to stay on Twitter.com rather than leave and search elsewhere. Smart move people, smart move.
(New search experience…man I love Pumpkin Spice lattes from Starbucks)
4. Media, media, media oh my!
This is probably the change you are hearing most about. The new platform has the ability to view pictures and video in stream, by expanding from the left column (your tweet stream) to the right column (now used more as an expanded view). In addition to seeing whatever multimedia you clicked on, you will also see people mentioned in the tweet you expanded, a brief history of that user’s tweets, and the latest tweet that tweet may have been in response to. Uhmmm, sound confusing? Basically, the expanded view of any tweet is now much more of a comprehensive story of that tweet. No longer on the web client will you be clicking from profile to profile to read a full conversation and get context. This new layout has put the story of a tweet together for you in one place. It’s smooth, trust me…you will like it!
(The new platform when you expand an image… Hi Matt!)
(The new platform with expanded video…ohhh puppy!)
5. All sorts of other little things
- You are not losing your backgrounds (phew!). At least right now, we still have them. Also, you might want to revisit your right-column profile color; it’s bigger now.
- Direct messages are up in your navigation (quite separate from the other functionality, actually) and are much more streamlined in my opinion. You now click in and see the number of DM exchanges, and can expand to see them all clearly. I was happy to see this. However, you no longer see a "number," which was the only way we web client users knew if we had a new DM (unless we got an email notification), so be careful not to miss those new DMs!
- The new platform still does not support multiple users, sorry folks!
- Retweets. I still don’t really like them, so don’t hate me when I say that I am stoked they made it so much easier to shut off retweets from someone! It’s in there next to the option to get a user’s tweets on your cell. Both options are right there and a simple click to change. Easy smeasy for sure.
- The new platform makes replying to multiple people challenging. No longer can you hit reply and aggregate user handles in one tweet, each "reply" click pulls up an individual tweet box. Ugh, yuck. I hope they change this soon.
(When you hit reply a box pops-up…still a bit buggy right now)
- "Trends" have some serious face time. I think we will find a lot more focus as marketers on getting our topics on the "trend" list (organically or not…maybe eventually purchased), as I can imagine this will be much like scoring first-page Digg time…similar at least. You can see they are now top right, whoa, in your face!
- They are calling this a "preview" of the interface, and when you get it, you will have a notification box where you must manually click into it. You can also (at least right now; I guess it’s going away in a few weeks) choose to "leave the preview" and return to your old interface. I don’t think you will want to, but to each their own.
That about sums up the big changes I am seeing. As for what it all means? I think this is a renewed focus on Twitter.com (the site), not Twitter (the company). Both Evan and Biz alluded to lots of changes coming down the pipeline, and there is a clear energy of excitement in the stream. I don’t know about you, but I am certainly going to be playing around more on the web interface, both as a user and a marketer. I think we will have some interesting opportunities coming our way…uhmmm, both as users and as marketers.