Wednesday 28 March 2018

7 Ways a Mobile First Index Impacts SEO

If you don’t like change, then the Internet is not for you. Google just changed how they’re indexing sites and announced more changes on the way. I’ve identified seven insights about the mobile first index and how it may influence rankings and SEO.

1. Mobile First Algo is Nuanced

It may be inappropriate to generalize what kind of content is best for a mobile first index. First, consider that every search query is different. Here is a sample of a few kinds of queries:
  • Long tail queries
  • Informational queries (what actor starred in…)
  • Local search queries
  • Transactional queries
  • Research queries
  • “How do I…” queries
Each one of those search queries can be answered by a different kind of web page, with different content length, with different needs for diagrams, maps, depth, shallowness and so on.
One simply cannot generalize and say that Google prefers short form content because that’s what mobile users prefer. Thinking in terms of what most mobile users might prefer is a great start. But the next step involves thinking about the problem that specific search query is trying to solve and what the best solution for the most users is going to be.
And as you’ll read below, for some queries the most popular answer might vary according to time. Google’s mobile first announcement explicitly stated that for some queries a desktop version might be appropriate.

2. Satisfy The Most Users

Identifying the problem users are trying to solve can lead to multiple answers. If you look at the SERPs you will see there are different kinds of sites. Some might be review sites, some might be informational, some might be educational.
Those differences are indications that there are multiple problems users are trying to solve. What’s helpful is that Google is highly likely to order the SERPs according to the most popular user intent, the answer that satisfies the most users.
So if you want to know which kind of answer to give on a page, take a look at the SERPs and let the SERPs guide you. Sometimes this means that most users tend to be on mobile and short form content works best. Sometimes it’s fifty/fifty, and most users prefer in-depth content, more product choices or fewer product choices.
Don’t be afraid of the mobile index. It’s not changing much. It’s simply adding an additional layer, to understand which kind of content satisfies the typical user (mobile, laptop, desktop, combination) and the user intent. It’s just an extra step: understanding who most of the users are and, from there, asking how to satisfy them. That’s all.

3. Time Influences Observed User Intent

Every search query demands a specific kind of result because the user intent behind each query is different. Mobile adds an additional layer of intent to search queries. In a Think with Google publication about how people use their devices (PDF), Google stated this:
“The proliferation of devices has changed the way people interact with the world around them. With more touchpoints than ever before, it’s critical that marketers have a full understanding of how people use devices so that they can be here and be useful for their customers in the moments that matter.”

Time plays a role in how the user intent changes. The time of day that a query is made can influence what device that user is using, which in turn says something about that user’s needs in terms of speed, convenience and information. Google’s research from the above cited document states this:
“Mobile leads in the morning, but computers become dominant around 8 a.m. when people might start their workday. Mobile takes the lead again in the late afternoon when people might be on the go, and continues to increase into the evening, spiking around primetime viewing hours.”
This is what I mean when I say that Google’s mobile index is introducing a new layer of what it means to be relevant. It’s not just about your on-page keywords being relevant to what a user is typing. A new consideration is how your web page is relevant to someone at a certain time of day on a certain device, and how you’re going to solve the most popular information need at that time of day.
Google’s March 2018 official mobile first announcement stated it like this:
“We may show content to users that’s not mobile-friendly or that is slow loading if our many other signals determine it is the most relevant content to show.”
What signals is Google looking at? Obviously, the device itself could be a signal. But also, according to Google, time of day might be a signal because not only does device usage fluctuate during the day but the intent does too.

4. Defining Relevance in a Mobile First Index

Google’s focus on user intent fundamentally changes what the phrase “relevant content” means, especially in a mobile first index. People on different devices search for different things. It’s not that the mobile index itself is changing what is going to be ranked. The user intent for search queries is constantly changing, sometimes in response to Google’s ability to better understand what that intent is.
Some of those core algorithm updates could be changes related to how Google understands what satisfies users. You know how SEOs worry about click-through data? They are missing an important point: CTR is not the only measurement tool search engines have.
Do you think CTR alone tells you what’s going on in a mobile first index? How can Google understand if a SERP solved a user’s problem if the user does not even click through?
That’s where a metric similar to Viewport Time comes in. Search engines have been using variations of Viewport Time to understand mobile users. Yet the SEO industry is still wringing its hands about CTR. Ever feel like a piece of the ranking puzzle is missing? This is one of those pieces.
Google’s understanding of what satisfies users is constantly improving. And that impacts the rankings. How we provide the best experience for those queries should change, too.
An important way those solutions have changed involves understanding the demographics of who is using a specific kind of device. What does it mean when someone asks a question on one device versus another device? One answer is that the age group might influence who is asking a certain question on a certain device.
For example, Google shared the following insights about mobile and desktop users (PDF). Searchers in the Beauty and Health niche search for different kinds of things according to device.
Examples of top beauty & health queries on mobile devices are for topics related to tattoos and nail salons. Examples of Beauty & Health desktop queries indicate an older user because they’re searching for stores like Saks and beauty products such as anti-aging creams.

It’s naïve to worry about whether you have enough synonyms on your page. That’s not what relevance is about. Relevance is not about keyword synonyms. Relevance is often about problem solving at certain times of day and within specific devices to specific age groups. You can’t solve that by salting your web page with synonyms.

5. Mobile First is About User Friendliness

An important quality of the mobile first index is convenience when satisfying a user intent. Does the user intent behind the search query demand a quick answer or a shorter answer? Does the web page make it hard to find the answer? Does the page enable comparison between different products?
Now answer those questions again, adding the phrase “on mobile,” “on a tablet,” “on a desktop” and so on.

6. Would a Visitor Understand your Content?

Google can know if a user understands your content. Users vote with their click and viewport time data, and quality raters create another layer of data about certain queries. With enough data, Google can predict what a user might find useful. This is where machine learning comes in.
Here’s what Google says about machine learning in the context of User Experience (UX):
“Machine learning is the science of making predictions based on patterns and relationships that’ve been automatically discovered in data.”
If content that is difficult to read is a turn-off, that may be reflected in what sites are ranked and what sites are not. If the topic is complex and a complex answer solves the problem then that might be judged the best answer.
I know we’re talking about Google but it’s useful to understand the state of the art of search in general. Microsoft published a fascinating study about teaching a machine to predict what a user will find interesting. The paper is titled, Predicting Interesting Things in Text. This research focused on understanding what made content interesting and what caused users to keep clicking to another page. In other words, it was about training a machine to understand what satisfies users. Here’s a synopsis:
“We propose models of ‘interestingness’, which aim to predict the level of interest a user has in the various text spans in a document. We obtain naturally occurring interest signals by observing user browsing behavior in clicks from one page to another. We cast the problem of predicting interestingness as a discriminative learning problem over this data.
We train and test our models on millions of real world transitions between Wikipedia documents as observed from web browser session logs. On the task of predicting which spans are of most interest to users, we show significant improvement over various baselines and highlight the value of our latent semantic model.”
In general, I find good results with content that can be appreciated by the widest variety of people. This isn’t strictly a mobile first consideration, but it is increasingly important in an Internet where people of diverse backgrounds are accessing a site with multiple intents on multiple kinds of devices. Achieving universal popularity becomes increasingly difficult, so it may be advantageous to appeal to the broadest array of people in a mobile first index.

7. Google’s Algo Intent Hasn’t Changed

Looked at a certain way, it could be said that Google’s desire to show users what they want to see has remained consistent. What has changed is the user’s age, what they desire, when they desire it and what device they desire it on. So the intent of Google’s algorithm likely remains the same.
The mobile first index can be seen as a logical response to how users have changed. It’s backwards to think of it as Google forcing web publishers to adapt to Google. What’s really happening is that web publishers must adapt to how their users have changed. Ultimately that is the best way to think of the mobile first index. Not as a response to what Google wants but to approach the problem as a response to the evolving needs of the user.

Starting From Scratch: The Simple Guide To Social Media And SEO Success

Viral marketing and building a social media following are common initiatives among startups and established businesses alike. Every executive and entrepreneur thinks they need to create a viral piece of content and get extremely lucky to win viral publicity. But if that's your strategy, you have little chance of success. Instead, ride an already existing viral wave.
Don't just create content and hope it goes viral. Use content that has proven to be viral and then post it to your website and social media accounts to gain web traffic and followers. For this to work, you need to have established social media profiles. You don't need to have a hundred thousand followers, but you do need real followers who are related to your niche.
1. Create social media accounts.
The first step is having company social media accounts across major social channels that pertain to your target market. These may include LinkedIn, Pinterest, Instagram, Facebook, Twitter, Tumblr and Reddit. These social media channels are all high-authority websites that help your company rank high in search engines. The more content you post on social media that receives shares, likes and comments, the higher your company's SEO ranking will climb.

2. Identify your customers.
Your dream customers are already online and are most likely following the top leaders in your niche. Tap into that audience and redirect them to become focused on your offerings. For example, if your niche is B2B sales consulting, figure out where that audience is online now. Make a list of all the B2B sales leaders like Grant Cardone, John Barrows, Jeffrey Gitomer, Jay Abraham and Tony Robbins.
3. Follow your competitors' audience.
Once you have identified who your dream customers are, the next step is to engage them. Follow your indirect and direct competitors' audiences on social media. Spend some time identifying power followers -- followers with their own audiences of over 2000. Apply this strategy across Facebook, Twitter, Instagram and LinkedIn. This works well because if you're in the same niche as the accounts they are currently following, there is a good chance they will want to follow you as well. These power followers tend to look at their audiences carefully as they build their networks. Your goal should be to spend 30 days adding at least 200 people on each social channel until you follow a few thousand people. Once you've done this, you should see at least a couple thousand people on each one of your social media accounts now following you in return.

Hijacking Google search results for fun, not profit: UK SEO uncovers XML sitemap exploit in Google Search Console

SEO wins bug bounty from Vulnerability Reward Program, Google search team confirms the exploit no longer works

In 2017, Google paid nearly $3 million to individuals and researchers as part of their Vulnerability Reward Program (VRP), which encourages the security research community to find and report vulnerabilities in Google products.
This week, Tom Anthony — who heads Product Research & Development at Distilled, an SEO agency — was awarded a bug bounty of $1,337 for discovering an exploit that enabled one site to hijack the search engine results page (SERP) visibility and traffic of another — quickly getting indexed and easily ranking for the victimized site’s competitive keywords.
Detailed in his blog post, Anthony describes how Google’s Search Console (GSC) sitemap submission via ping URL essentially allowed him to submit an XML sitemap for a site he does control, as if it were a sitemap for one he does not. He did this by first finding a target site that allowed open redirects; scraping its contents and creating a duplicate of that site (and its URL structures) on a test server. He then submitted an XML sitemap to Google (hosted on the test server) that included URLs for the targeted domain with hreflang directives pointing to those same URLs, now also present on the test domain.
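Based on Anthony’s description, the rogue sitemap likely looked something like the sketch below. The domains and URLs here are hypothetical placeholders; the actual files used in the experiment were not published. The `xhtml:link` element is the standard way hreflang annotations are expressed inside an XML sitemap.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Sitemap hosted on the attacker's test server, but attributed to the
     victim's domain because it was submitted through the victim's open
     redirect. -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <!-- A real URL on the targeted (victim) domain -->
    <loc>https://victim.example/widgets/</loc>
    <!-- hreflang alternate pointing at the attacker's copy of that page -->
    <xhtml:link rel="alternate" hreflang="en-gb"
                href="https://test-server.example/widgets/"/>
  </url>
</urlset>
```

The submission step would have used Google’s public sitemap ping endpoint, roughly `https://www.google.com/ping?sitemap=<sitemap-URL>`, with the sitemap URL routed through the victim’s open redirect so that Google treated the sitemap as belonging to the victim’s domain.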

Hijacking the SERPs

Within 48 hours, the test domain started receiving traffic. Within the week, the test site was ranking for competitive terms on page 1 of the SERPs. Also, GSC showed the two sites as related — listing the targeted site as linking to the test site:
Google Search Console links the two unrelated sites. Source: http://www.tomanthony.co.uk
This presumed relationship also allowed Anthony to submit other XML sitemaps — within the test site’s GSC at this point, not via ping URL — for the targeted site:
Victim site sitemap uploaded directly in GSC – Source: http://www.tomanthony.co.uk

Understanding the scope

Open redirects themselves are not a new or novel problem – and Google has been warning webmasters about shoring up their sites against this attack vector since 2009. What is noteworthy here is that utilizing an open redirect worked to not just submit a rogue sitemap, but to effectively rank a brand-new domain, brand-new site, with zero actual inbound links, and no promotion. And then to get that brand-new site and domain over a million search impressions, 10,000 unique visitors and 40,000 page views (via search traffic only) in three weeks.
The “bug” here is both a problem with sitemap submissions (the subsequent sail-through GSC sitemap submissions are alarming) and a greater problem as to how the algorithm immediately applied all the equity from the one site across to the completely separate and unrelated domain.
Source: http://www.tomanthony.co.uk
I reached out to Google with a series of detailed questions about this exploit, including the search quality team’s involvement in pursuing and implementing a fix, and whether or not they are able to detect and take action on any bad actors that may have already exploited this vulnerability. A Google spokesperson replied:
When we were alerted to the issue, we worked closely across teams to address it. It was not a previously known issue and we don’t believe it had been used.
In response to questions about changes with respect to sitemap submissions, GSC and the transfer of equity affecting results, the spokesperson said:
We continue to recommend that site-owners use sitemaps to let us know about new & updated pages within their website. Additionally, the new Search Console also uses sitemaps as a way of drilling down into specific information within your website in the Index Coverage report. If you’re hosting your sitemaps outside of your website, for proper usage it’s important that you have both sites verified in the same Search Console account.
I discussed this exploit and the research at length with Anthony.

The research process

When asked about his motivations for pursuing this work, he said, “I believe an effective SEO is someone who experiments and tries to understand things behind the scenes. I’ve never done any black-hat SEO, and so set myself the challenge of finding something on that side of things; primarily for the learning experience and as a way to run defense if I ever saw it in the wild.”
He added, “I like doing security research as a hobby on the side, so decided that rather than take the ‘traditional’ black-hat route of manipulating the algorithm’s ranking signals, I’d see if I could instead find an outright bug in it.”
Oftentimes, the driving motivation in pursuing a given method relates to having experienced (or having a client that has experienced) a sudden drop in SERP traffic or rankings. Anthony noted, “At Distilled, like so many SEOs, I’ve worked with sites that have had unexplained drops. Often clients claim ‘negative SEO,’ but usually it is something far more mundane. What is worrying about this specific issue is [that] typical negative SEO attacks are detectable. If I spam you with low-quality links, you can find them, you can confirm they exist. With this issue, it appears an attacker could leverage your equity in Google and you would not know.”
Over the course of four weeks’ evenings and weekends spent delving into it, Anthony discovered that combining different research streams he’d begun proved effective where each separately led to dead ends. “I had ended up with two threads of research — one around open redirects, as they are a crack in how sites work that I felt could be leveraged for SEO — and the other was with XML sitemaps and trying to make Googlebot error out when parsing them (I ran about 20 variations of that, but none worked!). I was so deep into it at this point, and had a revelation when I realized these two streams of research could perhaps be combined.”

Reporting and resolution

Once he realized the impact and harm that could be done to sites, Anthony reported the bug to Google’s security team (See complete timeline in his post). As this method was previously unknown to Google but clearly exploitable, Anthony noted, “It is a terrifying prospect that this could have already been out there and being exploited. However, the nature of the bug would mean it is essentially undetectable. The ‘victim’ may not be affected directly if their equity is used to rank in another country, and then the victims become the legitimate companies who are pushed down the rankings by the attacker. They would have no way to tell how the attacker site was ranking so well.”
As noted above, the Google spokesperson said they do not believe it has been used. Unclear from their response is whether or not they have data available that would enable them to detect pinged sitemaps used in such a way. If further comment or information is given, we’ll update this post.
On the issue of detection specifically, I asked Anthony to speculate on scaling this exploit. “The biggest weakness with my experiment was how closely I mimicked the original site in terms of URL structure and content. I had a bunch of experiments prepped that were designed to measure just how different you could make the attacker site: Do I need the same URL structure as the parent site? How similar must the content be? Can I target other languages in the same country as the victim site? In my case, I think I could have re-run with the same approach but have differentiated the attack site slightly more, and probably [would] have escaped detection,” he said.
He added, “If I had kept it to myself, then I imagine I could have gone for months or years. If you outright scammed people it would be short-lived, but if you used the method to drive affiliate traffic, or even simply to boost your own legitimate business, then there’s little reason you’d ever be caught.”
The short-lived traffic driven to the test site was potentially far more valuable than the relatively small (by comparison) bounty he was awarded, which makes one wonder if the security team really understood the implications of the exploit.

Anthony’s motivations (and why he did report the vulnerability right away) were rooted in research and helping the search community, however.
“Doing this sort of research is a learning experience, and not about abusing what you find. In the industry, we have our complaints about Google at times, but [to] a consumer, they provide a great service, and I think good SEOs actually help with that — and this is basically an extension of the same idea. The Vulnerability Reward Program they run is a nice incentive to focus research efforts on them rather than elsewhere; it is nice to potentially be awarded a bounty for the time and effort that goes into the research.”

Better Local SEO In 2018 Courtesy Of Google My Business SEO

Many businesses today have started seeing the importance of SEO and its role in improving their online presence. SEO has the power to increase leads interested in their products, services and information. With proper SEO, a good percentage of these leads can easily be converted into customers. In 2018, Google has ensured that businesses can benefit more if they employ ethical SEO efforts. This includes Google My Business SEO.
For businesses interested in getting a better online visibility when it comes to local search, Google My Business has a number of solutions. Google My Business SEO can help a business get spotted on Google maps and also get reviews from their customers. With visibility and customer input, it is easy for a business to improve their visibility online.
Talking about the analytics tool update, Google My Business community manager, Allyson Wright said that Google, “made some changes to your directions heatmap. Specifically, the new directions heatmap should allow businesses to track which areas customers are requesting directions from and show that to you on this map at various zoom levels. Direction requests from some post codes may appear as blank due to user privacy consideration but are included at the city level.”
According to Jack Lombardi, CEO of Chicago Website Design SEO Company, “In a survey, 89% of the participants said they search for local businesses on their smartphones at least once a week. 58% of the participants said that they search for local businesses at least once a day on their mobile phone. With such statistics, it is important that you are where your potential customers are searching. You want to top a list because this not only gives you authority. You will also get an advantage over your competitors since most people will choose you first.”
For any business interested in improving its visibility in 2018, Google My Business will go a long way toward making sure that people can see it. Online visibility is only important if your target audience can see the business, meaning people who can actually visit it. The best way to attract these individuals is through Google My Business search engine optimization. The secret is working with the right expert to have Google My Business optimized. This allows business owners to continue providing their services and products while the experts handle the SEO strategies.

Tuesday 27 March 2018

SEO for multi-language websites: How to speak your customers’ language

Combining international and multilingual SEO can get complicated. Contributor Marcus Miller walks us through the details of using International SEO tactics and multi-language content on global websites.
In my previous column, I took a look at the options, intricacies and best practices for international SEO. In this article, I want to build on those lessons and detail how to tackle multilingual websites.
As with international search engine optimization (SEO), there are many scenarios, and the right solution depends very much on the specific situation. Do you target one country where users speak multiple languages? Do you target specific languages around the world? Do you want a specific language for a specific country? In many cases, the solution will be a combination of all of these.
Combining international and multilingual SEO can get complicated. Mistakes can cost time and money while slowing your progress towards your SEO objectives. But knowledge is power, and ensuring you understand the options is key to success.

Creating content in multiple languages

There are a few common scenarios when creating content in multiple languages. Determining which of these matches your situation is key to making the right decision when building your site and tackling your website SEO.
The three main scenarios we see when building multi-language websites are:
  • Multiple languages serving the same country.
  • Multiple languages serving no specific country.
  • Multiple languages serving multiple countries.
Let’s look at each of these in a little more detail so you can understand what is the right choice for you.

Multiple languages serving the same country

Canada is a good example, since it is one country with two official languages, English and French.
Source: WorldAtlas
Here we could have a single website serving a single country with multiple languages. In this case, we would want to use a .ca country code Top-Level Domain (ccTLD) for Canada to automatically geo-locate the site and then have content in English and French to target French- and English-language queries.
Sometimes it is easier to consider what can go wrong here:
  • Is the English language version for the United Kingdom (UK)?
  • Is the English language version for the United States (US)?
  • For Australia? Or all the above?
  • Is the French version for France?
To ensure the search engine understands your site, geo-targeting and language targeting multilingual SEO tactics should include:
  • ccTLD for the country being served to benefit from default geo-targeting.
  • Single website with language-specific content in subfolders for English and French: /en/ and /fr/.
  • Site hosted in the country that is being targeted.
  • Hreflang tags specifying language and country.
  • Links from relevant specific language websites.
With all of these steps followed, a search engine has all the pointers needed to know that this content is for English and French language speakers in Canada.
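For example, the hreflang annotations on such a site (example.ca is a placeholder domain) might look like this, placed in the head of both language versions:

```html
<!-- In the <head> of https://example.ca/en/ AND https://example.ca/fr/ -->
<link rel="alternate" hreflang="en-ca" href="https://example.ca/en/" />
<link rel="alternate" hreflang="fr-ca" href="https://example.ca/fr/" />
```

Note that hreflang annotations must be reciprocal: each language version lists itself and its alternates, and the set must match across all versions, or search engines may ignore the tags.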

Multiple languages serving no specific country

Here we have a situation where we are targeting users based predominantly on their language.
We are not concerned if an English speaker is in the UK, the US or Australia or any other English language speaking location (small differences in spelling aside).
We don’t care if this is an Englishman in a country that speaks another language. As additional languages are added, they target speakers of that language around the world with no geographical bias.
Imagine a company that provides a software solution around the world. This business will want to have content in each language and have search engine users find the correct language version of the content.
So, an English speaking visitor in the UK, the US, Canada or Australia would all get the same content. A French speaker in France or Canada would also get the same French content.
Options here are a little more diverse. This is where considerations from the real-world and business operations become crucial in making the right decision (as discussed in more detail in my international SEO guide).
The tactic we recommend in this scenario is a single site with the following multi-language SEO tactics in place to support the desired ranking goals:
  • A generic TLD such as a .com that can target multiple countries.
  • Single website with language-specific content in subfolders (e.g., /en/, /fr/, /de/).
  • Site hosted in primary market with an international content delivery network.
  • Hreflang tags with language-only specified (not location).
  • Links from relevant specific language websites.
As the world gets smaller and subscription-based software solutions become ever more popular, this kind of setup is a simple way to target multiple languages across the globe.
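As a sketch of the hreflang portion of the list above (example.com is a placeholder), language-only targeting specifies just a language code, with an x-default entry for users who match none of the listed languages:

```html
<!-- In the <head> of each language version -->
<link rel="alternate" hreflang="en" href="https://example.com/en/" />
<link rel="alternate" hreflang="fr" href="https://example.com/fr/" />
<link rel="alternate" hreflang="de" href="https://example.com/de/" />
<link rel="alternate" hreflang="x-default" href="https://example.com/" />
```

Because no country code is present, the English page can rank for English speakers in the UK, the US, Australia or anywhere else.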

Multiple languages serving multiple countries

This is where things can get a little more complicated because we may have multiple versions of the same language with nearly duplicate content, so technical configuration needs to be 100 percent accurate.
We may have a site in English and French, and we may have an English language section for each of the UK, the US, Australia and Canada, along with a French page for France and Canada.
This is fairly basic: two languages and five locations. We have seen this get a lot more complicated, and if it confuses you, then the odds of tripping up a search engine are amplified!
Get this wrong and your rankings go down the international SEO tube.
Tactics here for a single site include:
  • A generic TLD such as a .com that can target multiple countries.
  • Default location and language (US English in this example).
  • Country-specific subfolders (gb/, au/, ca/, fr/).
  • Language-specific subfolders below the country-specific subfolders (gb/en/, ca/fr/).
  • Site hosted in primary market with an international content delivery network.
  • Hreflang tags with language and location specified.
  • Relevant links from location- and language-specific websites.
This is a straightforward way to achieve the targeting of multiple languages in multiple locations.
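To illustrate with the two-language, five-location example above (example.com is a placeholder, with US English as the default at the root), each page would carry a full reciprocal set of language-and-location hreflang tags:

```html
<!-- In the <head> of https://example.com/gb/en/ and each of its alternates -->
<link rel="alternate" hreflang="en-us" href="https://example.com/" />
<link rel="alternate" hreflang="en-gb" href="https://example.com/gb/en/" />
<link rel="alternate" hreflang="en-au" href="https://example.com/au/en/" />
<link rel="alternate" hreflang="en-ca" href="https://example.com/ca/en/" />
<link rel="alternate" hreflang="fr-ca" href="https://example.com/ca/fr/" />
<link rel="alternate" hreflang="fr-fr" href="https://example.com/fr/fr/" />
```

This is exactly where the near-duplicate English versions become safe: the hreflang set tells the search engine which copy to show in which market, rather than leaving it to guess.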

8 of the Worst SEO Mistakes Even the Experts Make

Digital marketing is like playing the drums; everyone thinks they can do it.
Inevitably, the layman writes content stuffed to the brim with a target keyword and cannibalizes their own webpages by using the same five keywords across all of them.
As infallible as we sometimes think we are, even the best of our industry can make some pretty harebrained mistakes.
Sometimes the best way to move forward is to take a step back and go back to SEO basics.
As Google and Bing’s algorithms continue to evolve and incorporate new technologies for search, so do our strategies.
Between optimizing our content for voice search, desktop visitors, mobile swipers, and our social media followers, the task can feel impossible and overwhelming.
Breathe a little, you’re not alone.
As much as the medium may change, the same principles still remain in place and so too do the same basic errors.
Here are eight common SEO mistakes that even the experts still make.

1. Presenting a Poor Internal Link Structure

As your website balloons in size with all of your awesome content, you’re bound to encounter some pretty basic internal linking errors. This includes everything from producing mass duplicate content to 404 page errors cropping up.
I think internal linking structures are vastly overlooked by webmasters, yet they serve some of the most valuable functions in your UX and SEO strategy.
Internal links provide five valuable functions for your website:
  • Providing clear pathways to conversion pages.
  • Spreading authority to webpages hidden deep on your site.
  • Providing additional reading or interactive material for users to consume on your site.
  • Organizing webpages categorically by keyword-optimized anchor text.
  • Communicating your most important webpages to search engine crawlers.
Resubmitting an XML sitemap to search engines is a great way to open up crawl paths for search engines to unlinked webpages.
Along the same lines, it’s important to use your robots.txt file and noindex tag wisely so that you don’t accidentally block important webpages on your site or a client’s.
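As a concrete illustration, a minimal robots.txt might look like this. The paths and domain are hypothetical; the point is that Disallow rules should target only low-value paths, never pages you need indexed:

```
User-agent: *
Disallow: /cart/
Disallow: /internal-search/

Sitemap: https://example.com/sitemap.xml
```

For pages you want crawled but kept out of the index (thin tag archives, for example), use a `<meta name="robots" content="noindex">` tag on the page rather than a robots.txt block; a page blocked by robots.txt cannot be crawled, so search engines will never see a noindex tag placed on it.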
As a general rule of thumb, no webpage should be more than two clicks away from the homepage or a call-to-action landing page.
Reassess your website architecture using fresh keyword research to begin organizing webpages by topicality.
HubSpot provides a great guide for creating topic clusters on your website: webpages arranged by topic, linked with semantic keywords, and organized in a hierarchy around their shared thesis.

2. Creating Content for Content’s Sake

Best practices dictate that you should produce content consistently to increase your brand’s exposure and authority, as well as increase your website’s indexation rate.
But as your website grows to hundreds of pages or more, it becomes difficult to find unique keywords for each page and stick to a cohesive strategy.
Sometimes we fall for the fallacy that we must produce content just to have more of it. That’s simply untrue and leads to thin and useless content, which amounts to wasted resources.
Don’t write content without completing strategic keyword research beforehand.
Make sure the content is relevant to the target keyword and utilizes closely associated keywords in H2 tags and body paragraphs.
This will convey full context of your content to search engines and meet user intent on multiple levels.
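As a sketch of that structure (the topic and keywords here are invented), the pattern is the target keyword in the title and H1, with closely associated keywords in the H2s:

```html
<!-- Hypothetical page outline for the target keyword "standing desk buying guide" -->
<title>Standing Desk Buying Guide | Example Store</title>

<h1>Standing Desk Buying Guide</h1>
<h2>Electric vs. Manual Height Adjustment</h2>
<h2>Ergonomic Desk Height and Setup</h2>
<h2>Best Standing Desk Accessories</h2>
```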
Take the time to invest in long-form content that is actionable and evergreen. Remember, we are content marketers and SEO specialists, not journalists.
Optimized content can take months to reach page one results; make sure it remains relevant and unique to its industry when it does.

3. Not Investing in Link-Worthy Content

As we understand it, the quantity and quality of unique referring domains to a webpage is one of Google’s three most important ranking factors.
Link building is a major industry pull for agencies. But going out and pursuing mass links through guest posting, manual outreach, and influencer marketing can be costly and resource intensive.
The best way to acquire links is naturally, leveraging stellar content that people just want to link to.
Instead of investing time in manual research and creating hundreds of guest posts a year, why not invest in a piece of content that can acquire all of those links in one day of writing?
Again, I bring up HubSpot, which provides a great example of this. Every year, they provide a list of industry statistics they scour from the internet, such as “The Ultimate List of Marketing Statistics”, which serves as an invaluable resource for anyone in the digital marketing industry.
As previously stated, invest the time in crafting long-form content that adds value to the industry.
Here, you can experiment with different forms of content, whether it’s a resource page, infographic, interactive quiz, or evergreen guide.
Dedicate some of your manual outreach strategy to promote a piece of content published on your own website and not someone else’s.

4. Failing to Reach Customers with Your Content

Continuing this discussion, you need to have a strategy in place to actually get people to view your content.
I believe that much of the industry and many businesses don’t invest as many resources into content promotion as they do production.
Sure, you share your content over social media, but how much reach does it actually acquire without paid advertising?
Simply posting your latest article on your blog, social media channel, and e-newsletter limits its reach to a small percentage of your existing audience.
If you’re looking to acquire new leads for your business, then you’ll need to invest more resources into promotional tactics. Some strategies include:
  • Paid social campaigns.
  • Targeted sharing using keyword optimized hashtags.
  • Promoting content over influencer channels.
  • Link building.
It's a bit of a chicken-and-egg problem: you need to promote content to earn its first links. Only then can you begin to acquire more links organically.

5. Optimizing for the Wrong Keywords

So you invested the time in crafting a piece of long-form content, but it’s not driving large-scale traffic to your website.
Just as bad, your visitors have low time on page and are not converting.
More than likely, you’re optimizing for the wrong keywords.
While most of us understand the importance of long-tail keywords for informational queries, sometimes we run into some common mistakes:
  • Failing to segment search volumes and competition by geography.
  • Relying too much on high volume phrases that don’t convert.
  • Focusing too many resources on broad keywords (external links, internal link anchor text, etc.).
  • Ignoring click-through rates.
  • Trying to insert awkward exact match phrases into content.
  • Ignoring AdWords value.
  • Allocating target keywords to irrelevant content.
  • Choosing keywords irrelevant to your audience.
It’s important to actually research the search phrases that appear in top results for both national and local searches.
Talk to your customers to see what search phrases they use to describe different elements of your industry. From here, you can segment your keyword list to make it more relevant to your customers.
Use keyword tools like Google Keyword Planner and SEMrush’s keyword generator for relevant keyword ideas.
Don’t forget to optimize for informational and commercial search queries.

6. Not Consulting Paid Media

As the industry currently stands, SEO focuses on acquiring and nurturing leads, while paid media focuses on acquiring and converting leads.
But what if we broke down those silos to create a cohesive message that targeted the buyer at every step of the journey?
As an SEO provider, do you even know what your client’s advertising message is or the keywords they use? Are you promoting the same products/service pages with the same keywords as the paid media department?
There is a lot of insight that SEO consultants can learn from PPC keyword research and landing page performances that can aid them in their own campaign.
Beyond this, Facebook's and Twitter's advertising platforms offer robust audience analysis tools that SEO consultants can use to better understand their client's customers.
By focusing on a unified message and sharing in each other’s research, SEO consultants can discover keywords that convert the highest and drive the most clicks in the search results.

7. Forgetting About Local

Google's Pigeon update opened up an entirely new field of local SEO.
Between local directory reviews, customizing a Google My Business page, and the local three-pack, local SEO is highly targeted and high converting.
Consider some of the statistics:
  • 50 percent of searches over a mobile device result in an in-store visit that day.
  • Half of local, mobile searches are for local business information.
  • Between 80 and 90 percent of people read an online review before making a purchase.
  • 85 percent of people trust reviews as much as personal recommendations.
It’s important to segment your keyword research for both local and national intent.
If you provide local services, be sure to create content that reflects local intent, such as including city names next to target keywords and in the body of content.
While most of us focus on growing business at the national scale, the importance of local SEO should not be ignored.

8. Not Regularly Auditing Your Own Website

One of the biggest mistakes we all make is not continuing to optimize our own site and fix mistakes that crop up over time.
A site audit is especially important after a site migration or implementation of any new tools or plugins.
Common technical mistakes that occur over time include:
  • Duplicate content.
  • Broken links.
  • Slow site speed through oversized images or poor JavaScript implementation.
  • Unoptimized meta tags.
Duplicate content can occur for a number of reasons, whether through pagination or session IDs.
Resolve URL parameter errors and session-based duplicates by adding canonical tags on the duplicate pages. This points all signals from the duplicates back to the source page.
Broken links are inevitable as you move content around your site, so it’s important to insert 301 redirects to a relevant webpage on any content you remove. Be sure to resolve 302 redirects, as these only serve as a temporary redirect.
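Both fixes are single lines. A hypothetical example with placeholder URLs: the canonical tag goes in the head of the duplicate page, and the redirect rule here uses Apache .htaccess syntax:

```html
<!-- On the duplicate URL (e.g. /shoes?sessionid=123), point back to the source page: -->
<link rel="canonical" href="https://example.com/shoes/" />
```

```apache
# In .htaccess: permanently redirect removed content to its closest relevant page
Redirect 301 /old-guide/ https://example.com/new-guide/
```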
Auditing your website is paramount for mobile search. Simply having a responsive web design or AMP is not enough.
Be sure to minify your CSS and JS on your mobile design, as well as shrink images, to provide a fast and responsive design.
Finally, one part of the audit that is often overlooked is reevaluating your onsite content strategy. Most industries are dynamic, meaning that new innovations crop up and certain services become obsolete over time.
Remodel your website to reflect any new product offerings you have. Create content around that topic to showcase its importance to your hierarchy to both search engines and users.
Continually refresh your keyword research and audience research to find new opportunities to scale and stay relevant.

Final Thoughts

Everyone is susceptible to mistakes in their craft and one of the best ways to rectify them is to consult the best practices.
My best bit of advice: Keep your mind nimble and always take a step back now and then to evaluate whether you are doing your best to scale your or a client's business.

5 SEO Myths Debunked

Search engine optimization is not a new technique. It has been around for years, but there are still rumors floating around regarding SEO.

This is not a good thing, as many people who are new to marketing businesses online fall for these myths and view SEO in a negative light.
The fact of the matter is that SEO is still, and will continue to be, one of the most important practices for boosting the visibility of a website.
Here are 5 SEO myths and the truth about them:
  1. Videos And Images Have Nothing To Do With SEO
This is nothing but a myth because in reality, videos and images serve a great role in SEO.
There are numerous images on a website and they all need to be properly optimized or else they won't direct users to your website. For instance, naming an image file after the relevant product or content, along with a descriptive alt attribute, contributes to SEO.
Also make sure to use the right dimensions for pictures, use image captions, and only use relevant images that are not copyrighted.
Videos too have a great impact on SEO. They help deliver a message in the most convenient and cheapest of ways.
Videos help target the relevant audience and increase engagement, which is a core part of what SEO does. Hence, creative, informative, and engaging videos help increase the visibility of a website.

  2. SEO Is a One-Time Thing
Many people think that all they need to do is implement top-notch SEO techniques in the beginning and that's it. Their website will jump ranks and reach the top.
While your website may hold a solid position this way, it would only be temporary. SEO is a technique that requires continuous implementation, changes, and updates.
This is because more than 1 billion websites are already on the internet competing with one another; newer algorithms keep being introduced, and SEO techniques need to change with them.
  3. Tricking Google Is Easy
With Google generating $109.65 billion in yearly revenue, how can you think that tricking them will be easy? It's next to impossible.
They have some of the most creative minds working for them, and no black hat tactics will bring your site to the top. The only way is to learn and implement good SEO and play by the rules Google has set.
You can also hire a good SEO agency to do the job for you.
  4. Keyword Stuffing Still Works
If you think that forcing keywords will help you improve your rankings then you’re wrong.
Google’s algorithm doesn’t work like that anymore. Today, keywords need to come naturally in the content so that the piece doesn’t seem promotional.
Keyword density per article should be kept to a minimum, around 1 to 2 percent.
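The 1-to-2-percent guideline above is easy to sanity-check with a short script. A minimal sketch; real tools also handle stemming and stop words:

```python
# Rough keyword-density check: share of words belonging to exact-phrase matches.
def keyword_density(text, phrase):
    words = text.lower().split()
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    # Count exact, in-order occurrences of the phrase.
    hits = sum(
        1 for i in range(len(words) - n + 1)
        if words[i:i + n] == phrase_words
    )
    return 100.0 * hits * n / len(words) if words else 0.0

# A deliberately stuffed sample article:
article = "blue widgets are popular because blue widgets are cheap " * 5
print(round(keyword_density(article, "blue widgets"), 1))  # → 44.4, far above the 1-2% guideline
```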
  5. Spamming With Poor Backlinks Is a Good Practice
Do not go for a huge number of backlinks; what matters here is quality. One backlink from a good website is better than a dozen backlinks from poor sites.
The Verdict
If you want your website to sit at the top rankings then make sure not to fall prey to these myths.

Saturday 24 March 2018

13 Things You Need to Include in SEO Contracts


SEO contracts can be confusing, controversial, and a sticking point that we all quickly want to get past in the search industry.
They don’t have to be, though, as they can be incredibly useful tools on the front end of an engagement to get everyone on the same page.
By making them useful and realistic, we can find fair terms for all parties.
The tone of our relationship with clients is often formed in the contract and negotiation phase.
If the contract is a useful tool and looked at as the next step in the conversation to get started, it can be less of an experience focused on legalities and one that is more about defining work, roles, and expectations.
When both parties end up with a feeling of mutual benefit from making the deal, it leads to solid expectations being set on what is covered, what isn’t, and how the work will be done.
You want everyone to feel protected and comfortable at this phase.
The components of the contract are essential to ensuring that you are thorough, but realistic in the service you are offering.
Here are 13 components that you should include in any SEO or digital marketing contract.

1. Info About Your Company

Take advantage of the opportunity to reiterate what you have already pitched or shared about your company or consultancy. You bring unique perspective and value to the relationship.
Don’t take for granted your opportunity here or assume that your prospective client isn’t reviewing two or three additional contracts or that occasional last-minute “friend-of-a-friend” proposal that might come through while your contract is in front of the prospect.
If your proposal is put side-by-side with one from another company, do you have enough content about your company and unique approach to compete?
Sure, you might still need a lot of legal stuff, but don’t forget that this is one more place to make your final pitch or reminders of why you offer the best option.

2. Definition of the Team Involved

Be clear about who on your team will be doing the contracted work.
This is your chance to set expectations for what the onboarding process will look like and how a transition from owner or sales-person to the search marketing team will happen (if a hand-off is going to happen).
If you’re selling the services but don’t do the day-to-day work, you want that to be clear so your new client knows they shouldn’t expect to call you for every detail of the campaign going forward and not to be surprised when new people are introduced into the relationship.

3. References

Similar to a resume, I recommend adding professional references and contact info to the contract.
By offering this up front, you’re showing honesty and trust by not waiting to be asked.
Just be sure to let the reference contacts you list know that they might get some emails or calls so they aren’t surprised (or annoyed).

4. Case Studies or Work Examples

You don’t need to let this content get in the way of the necessary components of the deal.
However, like the “about” section noted above, it is great when you can add one more reassuring section with links to relevant case studies, testimonial quotes, and other examples.
This is especially important if you're responding to an RFP or know that you're up against other competitors. Your work and unique approach help you differentiate on areas other than just price.
Even if you have pitched and presented this previously, it is possible that not all people reviewing the contract were involved in your previous interactions.

5. Clearly-Outlined Deliverables

There’s a time and place for legal terms: at the end of your contract (if necessary).
Keep your scope and deliverables section as client-friendly as possible.
The more confusing and legal you make the terms of what you’re doing in paid or organic search, the more you might overwhelm or confuse your prospect.
I have experienced instances in the past where the subject matter detail and stipulations tripped up a prospect or made them uncomfortable and caused them to go silent and pause the process.
Also, stick to your position on what your process and deliverables are.
Remain firm on your personal and company moral and ethical code even when pressed for guarantees or things that you aren’t comfortable including.
The more technical and legal your deliverables are, the more legalistically your service offering and campaign results will be judged later, and we know that in search nothing is guaranteed.
Clients often seek assurances on when rankings and traffic will be produced and you have to answer the question with confidence.
However, what you’re legally agreeing to in the contract should not violate search engine guidelines or put your company or the clients at risk.
Plus, you want to keep the conversation around the business goals of SEO and not on the literal ranking positions as they are not the end goal.

6. Timing of Delivery

Stating when you are starting the work, when specific phases will be completed, and what the overall term of the agreement is provides perspective.
If you are vague and the agreement is signed, you risk running into issues of when the client thinks specific activities are going to happen as the expectations may not be timed based on your process and methodology.
If you’re doing an audit or setup first, make that clear. That is valuable work that won’t necessarily result in the implementation of the strategy and drive traffic right away.

7. Pricing Definition & Structure

Whether you use a flat rate, have a blended rate, a revenue share, commission structure, have a flat management fee – or any other structure for pricing – give enough detail to answer any potential questions before they are asked.
Pricing can be a sensitive and make-or-break aspect of an agreement. You want to be sure about what your model is knowing the pros and cons of different models.
Confusion and obscurity can create questions in the mind of your prospect that lead to a bad impression or trust issue that is hard to get past.

8. Invoice Schedule

To avoid questions and confusion when you start billing, you want to spell out when invoices will be issued, what the payment terms are, payment options, and how installments will work.
If you’re billing prior to work or in arrears, that should be defined. This is an area that gives you some flexibility if you have it on your end.
If you have an initial setup fee, you can negotiate spreading it out over time or break out into more even monthly billing installments if that helps the client and wins the deal.
This also sets clear expectations so you don’t have to answer questions or settle terms when you’re well into a campaign.

9. Cancellation Terms

Regardless of whether your clients can cancel at any time or if they are in a long-term contract, you need to define what the term is and what the process is for terminating.
Is there a mutual opt-out clause?
Are there penalties for early termination?
Define those now so in the unlikely event they are exercised, everyone is clear on what steps are required and what financial obligations exist.

10. Legal Details & Contingencies

In most cases, you have some legal details to include in your agreements. In some cases you might have unique or custom contingencies built into the contract.
I recently wrote in a clause stating that if any key members of the team serving the client leave my organization, I will waive the standard cancellation period and financial obligations.
This gives the security to know they can walk away if they aren’t comfortable with the team they are working with if there’s a disruption.
I don’t do this for everyone, but in certain cases, I feel comfortable and confident to write that in.
For cases like “act of God” and worst-case scenarios, defining what is necessary legally can provide assurances for the investment your client is making.

11. Ownership of Work

To this day I continue to onboard clients who are burdened by difficult situations regarding getting their AdWords, Analytics, and other accounts transferred.
Even if you aren’t working with a client who has been burned in the past and isn’t questioning your motives, it’s a good move to include details on who owns what.
Do you own the reporting accounts, creative, and tactical plan? Does the client?
If you spell that out in advance you can reach a comfort level on what separation would look like and if you’re giving the client ownership, winning some goodwill in the process.

12. Scope Change Process

Set the tone for the possibility of expanding your contract down the road by identifying how scope changes will be handled.
What does it look like to change scope?
If you have to do a brand new contract and reset cancellation terms, that’s important to note up front so when you add the scope, you’re making the process easier.
If you just need a change order, define what that process looks like. Make it clear and easy to understand to save hassle and intense new negotiations down the road.

13. Expiration Date

There are two reasons to state an expiration date in your contracts.
  • You want your prospect to take action and execute the agreement now.
  • You reserve the right to change your rates and outside factors may necessitate changing your process and the scope.
By setting an expiration date, you can give yourself room to make changes or create a new contract if the prospect stalls on signing.

Conclusion

Overall these 13 contract components might seem like a lot.
Well, that’s because they are.
However, the SEO contract can be so much more than just a legal document. It can serve as a useful tool at what is often one of the most important steps in the relationship building process with a new client.

SEO Ranking factors panel: SMX West session recap

Contributor Eric Enge recaps the controversial takeaways and interesting marketing tips from the SEO Ranking Factors session at SMX West.

I attended the SEO Ranking Factors session at this year’s SMX West and enjoyed listening to the panelists present their opinions, experiences and ideas.
Any discussion of ranking factors raises a lot of emotions and controversy, and this was no exception. It is important to understand how to use information from these types of sessions: None of the data is fully conclusive, and it should always be taken with a grain of salt.
What it does tell us is the types of things that matter in the market, not what factors directly drive rankings.
That said, our panelists shared some great insights, so let’s dig in!

Olga Andrienko of SEMrush

Olga was the first to present. She covered her company’s recent ranking factors study where they looked at the top 100 positions across 600,000 keywords and segregated those keywords by competitiveness.
Unlike other ranking factors studies that rely on traditional correlation measurements (e.g., Pearson or Spearman correlation measurements), SEMrush used a machine learning model called Random Forest.
This is a model based on random dataset sampling to build decision trees. The approach involves taking several different data samples, building models with that data and then averaging the results across all the samples.
This approach can result in much better fits to the data in the training set but does carry with it some risk of “overfitting” the data. The study looked at several on-page factors like links and traffic.
I’m going to pull the points from Olga’s talk I found most interesting.
The first observation I’ll highlight from the study relates to the use of keywords in the title tag. Here is the slide from her deck:
As you can see, the study found the keyword appearing in the title of only 35 percent of the pages studied.
The chart also shows the percentage is far higher for the most competitive keywords. Notice the yellow dotted line is over 60 percent!
This follows what intuition would tell us: Keywords in the title matter more for competitive terms.
Keep in mind this relates to using the exact keyword phrase in the title. Titles that use related words, or use all the words in a different order, will not show as a match.
The next point I found interesting relates to how keyword-rich anchor text is used. Here is the slide:
The big point that emerges from this slide is how infrequently keyword phrases are being used in content that ranks well, even for high-volume terms. There is some good news in this: It suggests link spamming on the web is down. As an industry, we should celebrate that. It also suggests ranking highly can be done without getting a lot of rich anchor text links to your page.
However, you can also infer getting keyword-rich anchor text to your page can help if you’re trying to get to the number one spot. Just don’t use that suggestion as a motivator to start a spammy link campaign!
Content length is another issue the study addressed, as shown here:
In spite of all the discussion around making content shorter for mobile, this data suggests longer content length helps with ranking. The study shows the top three ranking positions host longer content.
In spite of all the things we hear about people wanting short, quick reading content, it doesn’t seem that’s really the case.
The SEMrush study also addressed the topic of page speed, finding that higher-ranking pages across all keyword groups loaded significantly faster than lower-ranking pages.
I found the chart on “time to first meaningful paint” particularly interesting:
First Meaningful Paint measures the time it takes for primary webpage content to first appear on a user's screen. See how the data shows the ranking curve favoring faster sites? The chart for time to first meaningful activity showed a similar result.
This data does not mean speed is a direct ranking factor, but other studies have shown people like faster-loading sites.
Last but not least, I found the slide on links quite interesting:
The data supports the notion links still matter a great deal in terms of determining where a webpage ranks.
We also see the number two position has quite a large peak of links across all keyword groups. I’m still trying to decipher that one.
Here is Olga Andrienko’s presentation:

Marcus Tober of Searchmetrics

Marcus focused on two themes:
  1. The idea that ranking studies are harmful to the search industry.
  2. The importance of performing ranking analyses on an industry-by-industry basis.
For the first point, I think it's safe to say a ranking study can do harm if you apply or believe its implications blindly.
Just because higher-ranking pages are faster doesn't mean speed drives rankings. But it could be a strong indicator that having faster pages is important to ranking well.
One of the points he used to illustrate this was content length. Just like the SEMrush study, Marcus reported longer content appears to rank better.
In fact, the length of content in the top 10 of the search engine results pages (SERPs) appears to be going up over time:
During much of his presentation, Marcus reviewed how different market sectors vary. The sectors he looked at were dating, wine, recipes, furniture, car tuning and divorce.
For his second point (the importance of performing ranking analyses on an industry-by-industry basis), Marcus explored the role of microdata and using it to compare dating, recipes and divorce segments:
The recipes segment has the highest usage of microdata by far. That is not surprising since Google lets you mark up recipe content with structured data, which, in turn, provides rich results and increased search visibility.
Another one of Marcus’s points focused on video:
I really like this point because it shows how you need to tailor your content based on the market you’re in.
If you’re in the fitness business, videos make a lot of sense, but not so much for divorce or wine.
This highlights how studies may not always be focused on ranking factors but still share insights which offer value.
Another interesting observation Marcus shared was about content length across the divorce, fitness and furniture markets:
Notice how content on divorce sites tends to be longer, while fitness is far behind. This shows that the need for longer content varies by market.
People looking for wine don’t want to read, they want to drink!
In contrast, though, when it comes to images, Marcus compares the divorce, recipes and furniture markets:
See how the furniture market is the clear leader here, which makes sense. When you’re selling furniture, you need to create a visual experience with lots of choices.
For recipes, visuals are important, as people want to see what the food will look like. But in the divorce market, images are not nearly as useful.
Here is Marcus Tober’s presentation:

Chanelle Harbin, Disney/ABC Television Group

Last, but not least, was Chanelle Harbin who shared a series of real-world case studies.
The first case study related to structured data. In this case study, Chanelle and her team implemented VideoObject schema across video pages. They also exposed closed captioning to the search engines, with these results:
The scale of growth of visits and page views is impressive! This is also consistent with what I have seen across other schema.org tests that I’ve participated in.
The American Broadcasting Company (ABC) also measured the results when they implemented structured data across recipe pages for the television show "The Chew." The results here were also spectacular:
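Recipe markup of this kind is typically added as a JSON-LD block in the page head. This is a generic illustration with invented values, not ABC's actual implementation:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Recipe",
  "name": "Guacamole",
  "author": {"@type": "Person", "name": "Example Chef"},
  "recipeIngredient": ["2 avocados", "1 lime", "salt"],
  "recipeInstructions": [
    {"@type": "HowToStep", "text": "Mash the avocados."},
    {"@type": "HowToStep", "text": "Stir in lime juice and salt."}
  ]
}
</script>
```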
I am a strong proponent of the value of featured snippets. Chanelle shared some great data from what they saw when they obtained a featured snippet for the phrase “Beyoncé guacamole recipe”:
Now, that’s a serious uptick in traffic!
ABC also participated in some good old-fashioned link building. Here are some of the key actions they took:
  1. Reached out to sites that mentioned their shows and asked for those mentions to be converted into links.
  2. Found broken links pointing at key pages and asked the linking sites to fix them.
  3. Contacted sites with very similar content that ranked highly.
These are all very basic practices that any site can take on, and the results that ABC got were stellar. Here are a couple of examples:
Finally, ABC ran tests comparing the performance of standard pages against accelerated mobile pages (AMP). Here are the user engagement results they saw from these tests:
Here are the improvements in traffic:
Bear in mind that AMP is not a ranking factor, but as ABC is a news site, participation in AMP made them eligible for the AMP news carousel on smartphone devices.
It’s been suggested that Google may let non-AMP pages appear in the smartphone news carousel and Top Stories on desktop in the future. Even if that does happen, the user engagement benefits ABC received will remain in place.
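For reference, Google discovers a page’s AMP version through a pair of link elements, one on each version of the page. The URLs below are hypothetical, but the pattern is standard:

```html
<!-- On the canonical (non-AMP) article page -->
<link rel="amphtml" href="https://example.com/article/amp/">

<!-- On the AMP version of the same article -->
<link rel="canonical" href="https://example.com/article/">
```

Without this pairing, an otherwise valid AMP page won’t be associated with its canonical article and won’t be eligible for the news carousel.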
Here is Chanelle Harbin’s full presentation:

How to Write a Great Title Tag for Google & Users


Editor’s note: “Ask an SEO” is a weekly column by technical SEO expert Jenny Halasz. Come up with your hardest SEO question and fill out our form. You might see your answer in the next #AskanSEO post!

Today, we’re talking title tags. Here’s this week’s Ask an SEO question:
I would like to ask how you write a title tag today. Would you be using: primary keyword | alternative keyword | company name? (exact phrase) or would you drop the keywords in the title tag in any order?
If the keywords are residential builders and commercial builders, would you write it as:
Residential Builders – Commercial Builders | Company Name
Or would you be writing it as:
Residential, Commercial Builders in Location | Company Name
What is your take on this? How does Google read title tags today when you search for terms?
Ultimately, the only way you’ll know for sure what works for you is by testing and measuring everything you do in SEO.

Optimize Your Title Tag for Search

While I can tell you that Google’s spider accesses words in the order in which they are presented in the code, that doesn’t really answer your question and could lead you down the wrong path.
Lots of SEO professionals will tell you that the title tags should have the most important keyword first, and then the company name at the end, if you choose to include it.
This is generally a good way to start because it makes sense based on what we know about how Google’s spider parses code.
However, what’s far more important is how it actually performs.

Optimize Your Title Tag for People

Google needs to present a title tag that will compel users to click on the result and then hopefully have a great on-site experience. That’s why Google sometimes changes the title you have provided.
You should try some different ways of presenting the information based on what you know.
  • Is the prospective customer likely to see some value in your brand name? Then you should probably include it in the title.
  • What about the location? Important for builders and dentists, less so for T-shirt sales.
You don’t need to do an exact match in your title tag.
“Residential Builders – Commercial Builders | Company Name” is just awkward.
You also need to consider the device the customer is most likely to see your listing on.
If it’s most likely a desktop or tablet, you have more room in the title.
If it’s a mobile device, and you’re doing SEO for a restaurant, you’d better get the restaurant’s name at the front of the title.

Add Important Information in Your Meta Description

Keep in mind you also have the meta description tag to communicate key information.
Without knowing anything about your company or your location, I suggest you emphasize the more important keyword and the location first. Something like:
TITLE: Residential Builders in City | Company Name
META DESCRIPTION: Home and commercial builders, licensed in state. [Another sentence that contains a key selling point.] Contact us for a free estimate today.
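In the page’s HTML, both elements sit in the head. A sketch using a hypothetical city and company name (swap in your own details and selling point):

```html
<head>
  <!-- Title: primary keyword and location first, brand name last -->
  <title>Residential Builders in Denver | Acme Construction</title>
  <!-- Meta description: key selling point plus a clear call to action -->
  <meta name="description" content="Home and commercial builders, licensed in Colorado. Award-winning custom designs. Contact us for a free estimate today.">
</head>
```

Keep the title under roughly 60 characters and the description under roughly 160 so neither gets truncated in the search results.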

Optimize Title Tags on Every Webpage

If you also do commercial building, I suggest emphasizing that on a different page.
Remember, you have lots of different pages on your website, and the home page is rarely going to be the most relevant.
If someone is looking for a commercial builder, they’re looking for a much different experience than a residential one. Commercial is all about efficiency, licensing, bonding, budgets, etc.
While these things are also important to people searching for residential builders, they’re more interested in beauty, comfort, quality, and probably price.
Commercial sales will be mostly made based on facts, while residential sales will be mostly made based on emotion.

International Search Engine Optimization (ISEO) Expert Chris Raulf of Boulder SEO Marketing to Teach an SEO/ISEO Master Class in London, England on April 10, 2018

Boulder SEO Marketing, a provider of SEO training and consulting services, announced today that its founder and president will teach a half-day SEO/ISEO master class on April 10, 2018, in London, UK. Josh Steimle of MWI, a globally operating digital marketing agency, will also teach a half-day digital PR master class. Taught by two of today's top digital marketing experts, the classes will give attendees actionable insight into how SEO/ISEO and digital PR can help increase a business's digital visibility, resulting in increased sales. Additional information and registration are available by visiting
"This digital marketing course in London is a great fit for anyone wanting to improve the visibility of their website in Google and other search engines," notes Boulder SEO Marketing founder Chris Raulf. He adds: "It also specifically targets language service providers who are interested in offering multilingual and international SEO to their website localization customers."
Chris will share proven methods of how to optimize a website to rank naturally on page number one on Google and other search engines for local, national and international businesses alike. Topics covered include:
  • Digital marketing strategy development
  • Keyword research for SEO
  • On-page and off-page SEO best practices
  • How to perform an SEO website audit
  • International and multilingual SEO
Josh, a writer, journalist, and digital marketing expert, will share how to produce marketing strategies and campaigns that get results, and what kind of content top digital publications are looking for.
Attendees will leave with an actionable strategy for getting a business featured in digital outlets.
Attendees can attend both sessions or choose between the two classes. Those who attend both will receive complimentary access to a top-rated, seven-hour self-paced online SEO course.
About Boulder SEO Marketing
Now officially also a London SEO agency, Boulder SEO Marketing features offices and training facilities in Denver and Boulder, Colorado, and in London, UK. We take pride in offering 5-star rated in-person and online SEO training courses for all levels, and business professionals worldwide rely on our digital marketing training and consulting services. We work with companies from around the globe to implement strategies that will boost sales from their online marketing efforts.