SEO

Mobile Friendly Search Results are Coming

Here are the two main reasons you should make sure your website is mobile friendly:

  1. Mobile traffic – Give the potential clients who visit your website a better experience. We have been following mobile traffic in Google Analytics for the past five years, and over that period we have seen mobile traffic jump from a range of 1%–5% (depending on the market) to 20%–35%, with some clients seeing mobile traffic above 45%.
  2. Google has announced that it will start to value mobile friendly websites in its rankings. For the past few years, people have speculated whether Google would move in this direction based on various indicators and signs, but up until this past February it was only speculation. Click here to read the announcement, in which Google explicitly states that it “will be expanding their use of mobile-friendliness as a ranking signal”.

To test if Google considers your website to be “mobile friendly” click here.
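
If you want a quick first pass before running Google’s official test, the rough heuristic below simply checks whether a page declares a responsive viewport meta tag, one basic marker of a mobile-friendly layout. This is only a sketch, not Google’s actual test (which also looks at tap targets, font sizes, and content width), and the URL is a placeholder:

```python
# Rough first-pass heuristic only -- Google's Mobile-Friendly Test checks far
# more (tap targets, font sizes, content width). The URL is a placeholder.
import re
import urllib.request

def has_responsive_viewport(url: str) -> bool:
    html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "ignore")
    # Find the <meta name="viewport" ...> tag, if any.
    match = re.search(r'<meta[^>]+name=["\']viewport["\'][^>]*>', html, re.IGNORECASE)
    # width=device-width is the usual signal of a responsive layout.
    return bool(match and "width=device-width" in match.group(0).lower())

if __name__ == "__main__":
    print(has_responsive_viewport("https://www.example.com"))
```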

Keep in mind that if you have a great website and don’t want to go through the process of re-designing it from scratch, we are able to convert your website into a responsive theme. Depending on your design, some or many elements may need to change… but we can often maintain the overall look and feel of your design.

If you would like to explore a new responsive website, or to convert your existing website to a responsive theme, call us at 800.267.1704 and one of our account managers will help you determine your options.


What Google’s Patents and Acquisitions Can Teach Us About SEO

Matt Cutts, the head of Google’s webspam team, recently released a video in which he discussed a future where links carry less weight in the ranking of a website. To save you three minutes: Google knows that as long as its algorithm depends mostly on links, it will always be easy to manipulate. So it makes sense for Google to do everything it can to move away from links as the primary ranking factor.

The First Step Away from Links as a Backbone

Back in 2005, Google filed a patent called Agent Rank for a technique that would allow it to rank a piece of content based on the person who authored it. In theory, when content is added to the web, a digital signature would connect the real-life author to a digital author profile in Google’s database. This signature would be unique and attached to every piece of content that author puts on the web, creating a portfolio of sorts for each author.

An authority score given to different authors, or “author rank,” could then be used to give weight to new articles and content by that author. For instance, the author could launch a brand-new website, and because their author rank is high across 20 other websites, Google would associate this author with other great content and potentially give more authority to that website, even though it is new.
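
Google has never published a formula for this, but as a thought experiment, an author score could be as simple as averaging the authority already earned by everything an author has signed and using that average as a prior for anything new they publish. The sketch below is purely illustrative; the author, sites, and scores are made up:

```python
# Toy illustration of the Agent Rank idea -- not Google's formula, which has
# never been published. The author, site names, and scores here are made up.
author_portfolio = {
    "jane-doe": {
        "site-a.com/article-1": 0.80,   # authority already earned by each piece
        "site-b.com/guide": 0.72,
        "site-c.com/howto": 0.90,
    }
}

def author_rank(author: str) -> float:
    """Average the authority of everything the author has signed."""
    scores = author_portfolio.get(author, {})
    return sum(scores.values()) / len(scores) if scores else 0.0

def prior_for_new_page(author: str, base_score: float = 0.1) -> float:
    """A brand-new page starts from its own (low) score plus a boost
    inherited from the author's track record elsewhere."""
    return base_score + 0.5 * author_rank(author)

print(prior_for_new_page("jane-doe"))   # the new site benefits from the author's history
```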

Google Authorship

Some websites, though, don’t include authorship info, and Google doesn’t have any good way to assign value to those pages outside of links. But that could change with Google’s acquisition of DeepMind, home to some of the world’s leading researchers in artificial intelligence and deep learning. The term “deep learning” has been used since the mid-2000s to describe machine-learning architectures that can draw connections between different sets of data. It makes sense, then, that deep learning is most effective when it has large quantities of data to sort through and analyze.

How Google Will Come To Know Us Better Than Ourselves

Well, as of January 2014, Chrome holds a dominant market share at 55% of all internet browsing, Google Analytics is on over 15 million websites, the Ad Network reaches over 2 million websites, and Gmail is the leader in webmail, so it’s no secret that the amount of data Google is able to collect is simply unfathomable. Up until now, sorting through so much data and drawing informed conclusions has been troublesome for computers. From TechCrunch:

World-renowned artificial intelligence expert and Google’s new Director of Engineering, Ray Kurzweil, wants to build a search engine so sophisticated that it knows users better than they know themselves. “I envision in some years that the majority of search queries will be answered without you actually asking.”

But now, DeepMind’s AI will play a role across Google’s infrastructure, including search, advertising, and social. With the end goal of documenting and drawing smart connections between the real world’s people, places, events, and things, we have to assume that Google will use its massive repositories of data to create individual user profiles for each of us – including authors and readers.

Big Data + Deep Learning = Personalized Results

Users will see an increased level of relevancy in searches. For instance, if someone has emails in their inbox discussing the purchase of a new Honda Civic, then likes Honda on Google+, and finally posts pictures of their new Honda Civic to G+ with hashtags, then when that user searches for “change spark plugs”, Google can tailor the results to include videos and tutorials specific to the user’s history – in this case, changing spark plugs on a Honda Civic.
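
A deliberately simplified sketch of that kind of personalization is below. The scoring is hypothetical, not Google’s: results whose wording overlaps with what is already known about the user simply get nudged up the list.

```python
# Deliberately simplified personalization sketch -- hypothetical scoring,
# not Google's algorithm. The profile and results are made-up examples.
user_profile = {"honda", "civic", "2014"}   # gleaned from email, G+, photos, etc.

results = [
    ("How to change spark plugs (any car)", 0.70),
    ("Changing spark plugs on a Honda Civic", 0.65),
    ("Spark plug gap basics", 0.60),
]

def personalize(results, profile, boost_per_match=0.1):
    """Boost each result's base relevance by its overlap with the user profile."""
    reranked = []
    for title, base in results:
        overlap = len(profile & set(title.lower().split()))
        reranked.append((title, base + boost_per_match * overlap))
    return sorted(reranked, key=lambda r: r[1], reverse=True)

for title, score in personalize(results, user_profile):
    print(f"{score:.2f}  {title}")   # the Civic-specific tutorial now ranks first
```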

Authors, in turn, should see an increase in engagement from their readers. Let’s say I’m a mechanic and operate a blog detailing simple maintenance on Hondas and Toyotas (Japanese cars). Google sees that I frequently discuss topics related to car maintenance and that I mention Japanese brands, not American ones. Ideally, users who are searching for “how to change my car’s oil” and own Fords won’t see my blog; by the same token, the time users do spend on my site will likely increase because the content is more relevant to them.

In the future of the internet, where digital and real life become more integrated, it will be important that we associate ourselves and our businesses with others that are considered to be industry leaders, in hopes of being given credit and benefit of the doubt based on association. As such, we should all start building a digital reputation for ourselves now, so that we aren’t behind when the time comes.

May 8th, 2014 – Posted to Search Engine Optimization.
To contact the author, emails can be sent to: blewis@thesearchengineguys.com


Yelp! I Need Somebody

Since the website Yelp (www.yelp.com) was relaunched in February of 2005 with a decided focus on providing user reviews of businesses and professionals, the site has become one of the most frequently visited sites on the web, as people look to reviews from others in their communities before going to a new mechanic, visiting a new doctor, or even eating at a new restaurant. For a while, these reviews had limited reach and only seemed to affect the ranking of businesses within Yelp’s internal search engine.

However, since Penguin 2.0 rolled out back in May of this year, we’ve noticed an interesting trend among search results that appears to be increasingly prevalent: the high ranking of directories for many search terms in Google. Directories like Angie’s List, Yelp, and even the Better Business Bureau are showing up on the first page of Google – often in the first and second spots – for extremely competitive keywords. Click here to see an example.

What This Means for You

If you haven’t claimed your business’ Yelp listing, do it now. You could be missing out on valuable opportunities to get your name out in your community and to rank higher for your targeted keywords in Google and on Yelp. Additionally, there does seem to be a correlation between the number of reviews a business has and how strong their rankings are, both inside and outside of Yelp.

Thus, if you have satisfied customers, encourage them to leave you a Yelp review. This seemingly small action could have significant effects on your placement online, making it well-worth the effort.

Amy  September 16th, 2013 – Posted to SEO.

To contact the author, emails can be sent to: amann@thesearchengineguys.com


Google Hits June With A Heat Wave

Ever since Matt Cutts, Google’s head engineer in charge of webspam, released a video in May about what to expect over the next few months in terms of SEO, there have been noticeable fluctuations in the search results for many website owners.

The MozCast Google weather tracker is a tool designed by the highly regarded web-marketing company Moz. Recently, MozCast displayed what many SEO experts consider to be alarming temperatures for the month of June. A “normal” reading is usually somewhere between 50 and 80 degrees, but two readings this month shot up over 100 degrees, breaking records on the weather chart. In fact, June 27, 2013 yielded a 120-degree reading, the highest ever seen in the history of MozCast (a rough sketch of how such a “temperature” can be computed appears after the list below). Overall, changes in the Google algorithm have shaken up the search results several times this past month, including rollouts such as:

  • 10 Day Panda Monthly Update
  • Payday Loan Algorithm
  • Partial-Match Domain Update
  • Multi-Week Update
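
Moz’s actual methodology is more involved, but the gist of a SERP “temperature” can be sketched as follows: measure how much of the top rankings changed day over day across a set of tracked keywords, then scale that churn so a typical day reads around 70 degrees. Everything below (the rankings, the baseline churn, the scaling) is a made-up illustration, not MozCast’s real formula:

```python
# Rough illustration of the MozCast "temperature" idea only -- Moz's actual
# methodology is more involved. The example rankings below are made up.
def churn(before, after):
    """Fraction of yesterday's top results that are gone or moved today."""
    changed = sum(1 for pos, url in enumerate(before)
                  if pos >= len(after) or after[pos] != url)
    return changed / len(before)

def temperature(keyword_serps, baseline_churn=0.25, baseline_temp=70.0):
    """Average churn across keywords, scaled so a 'normal' day reads ~70 degrees."""
    churns = [churn(before, after) for before, after in keyword_serps]
    avg = sum(churns) / len(churns)
    return baseline_temp * (avg / baseline_churn)

# Two made-up keywords: yesterday's vs. today's top 3 (top 10 in practice).
serps = [
    (["a.com", "b.com", "c.com"], ["a.com", "b.com", "d.com"]),
    (["x.com", "y.com", "z.com"], ["x.com", "w.com", "z.com"]),
]
print(f"{temperature(serps):.0f} degrees")   # ~93: warmer than a typical day
```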

10 Day Panda Monthly Update (Panda Dance)

In March, Google mentioned that it would stop publicly announcing Panda updates, as the algorithm would continue to roll out monthly. Google is currently pushing out Panda updates over a 10-day period every 30 days.

What does this mean exactly? Google pushes the update on a specific day, and from that start date the algorithm continues rolling out over a 10-day span. The push then repeats every month.

Google has been working on refining the Panda algorithm to help sites that are lingering “on the border” of being impacted by the update. Cutts mentioned that they are “softening” the Panda algorithm by adding signals that look for quality metrics on these types of websites. It is unclear how soft these rollouts will be or how much change will take place in the search results.

Payday Loan Algorithm

Cutts also mentioned in this video (at 2:30) that there would be a new search update targeting “spammy queries.” Roughly a month after the video was posted, Cutts sent out a tweet confirming his statement.

Certain industries, payday loans being one, are infamous for abusing Google’s algorithm by using automated software to build large numbers of backlinks in a short amount of time. Of course, these websites do get caught after some time and lose their rankings once detected. Once a website gets flagged by Google for these unfavorable link-building tactics, the company tosses the old website, starts over with a new one, and repeats. This “churn and burn” method violates Google’s guidelines and has been on Google’s list of tactics to clean up.

This algorithm update is aimed at spammy link building and other spam tactics globally. Other affected search terms include “car insurance” and pornography-related queries.

Partial-Match Domain Update

On June 25th, MozCast reached 113 degrees, surpassing the previous high of 102 degrees set on December 13, 2012. In this blog post by Dr. Peter J. Meyers of Moz, he discusses case studies monitoring de-personalized and de-localized queries.

Meyers conducted two different studies on the search terms “limousine service” and “auto auction.” Both cases showed similar patterns that indicate a partial-match domain update.

Multi-Week Update

While MozCast temperatures were fluctuating heavily, Cutts also threw a Multi-Week update into the mix. Details on this update haven’t been confirmed yet, but there has been speculation about what kind of update it will be. Will it be a follow-up to the Payday Loan Algorithm, or is the PMD update just a trigger for something of greater impact?

Cutts announced that the rollout began on June 21, 2013 and would continue to affect the search results until the week after July 4, 2013. We’ll keep an eye on this update as it keeps rolling out through the month of July.

Why does Google push out algorithm updates so often?

Google strives to improve the quality of search results for the user by keeping quality sites ranked high and devaluing sites that prove to be harmful or untrustworthy. To ensure that Google users get the best-matching results for their queries, algorithm pushes are necessary to keep spam and questionable websites off the results page.

What should you expect for July?

Expect continued algorithm fluctuations over the next month, considering the turbulence that occurred in the last week of June. If the pattern continues with the Multi-Week update, we could be approaching a few more stormy SEO days. Look out for:

  • Advanced spam detection as Google is constantly fighting off low quality and irrelevant websites
  • Google working on improving malware detection
  • Due to the consistent Panda rollouts, Google working on improving that update by adding new metrics and quality signals

What does this mean for you?

If your site has lost placement recently or has been moving around in the SERPs, the high number of algorithm changes that rolled out in June may be the reason. While we expect things to start settling down soon, the effects of these updates are likely to continue for a little while longer.

Nancy Tran   July 3rd, 2013 – Posted to Search Engine Optimization.

To contact the author, emails can be sent to: ntran@thesearchengineguys.com


Preparing for Penguin 2.0

We have been through 5 major Google algorithm updates in the last 6 years and dozens of minor updates. Google recently stated that “We make over 500 changes to our algorithms a year, so there will always be fluctuations in our rankings in addition to normal crawling and indexing.” Additionally, SEOMoz reports that there have been 76 notable algorithm updates since 2007. Most of the minor updates go largely unnoticed by everyday users of Google and may feel more like typical fluctuations due to the content changes in the index.

Major updates are more like 50 year storms, and during major updates, it’s not uncommon for sites that enjoyed dominant first page positions to drop out of the top 10 pages.

Currently, the industry is buzzing with talk of a major update that Matt Cutts has labeled Penguin 2.0. Given the buzz, we thought we would share our 10 cents on how to handle the next “big one,” whether it happens this week or in 10 weeks:

  1. Cultivate a healthy paranoia. Most in the SEO community know what this means, because in the aftermath of a major update, the chaos and confusion are thick. During this time it’s important to question everything you read. Ask yourself whether the information is coming from a “talking head” commenting on what happened or from a webmaster with skin in the game. Be skeptical of statements of fact and leery of predictions. Early statements may very well be true, but to know for certain, tests need to be run to validate and verify them.
  2. Don’t over-react. Let the dust settle before you draw conclusions. It is tempting to be shortsighted and draw knee-jerk conclusions during a major algorithm update, but try not to. Conclusions should be formed, but not in the opening days or weeks following an update. If past updates are good indicators of what will play out (they may not be), it will take a few months for the SEO community to know what happened and how to proceed.
  3. Be proactive. Since past Penguin updates have generally affected websites based on their backlink profiles, it makes sense to audit your site’s backlinks routinely to make sure it consistently meets the quality standards that keep it from being penalized (see the sketch after this list).
  4. Have an alternative traffic plan in place. The fastest way to replace lost traffic short-term (if your site has lost organic placement) is via Google Adwords and other CPC platforms (Yahoo/Bing, Facebook, LinkedIn). Consider industry directories and industry specific email/phone and lead-generation platforms.
  5. Give it some time, but not an indefinite amount of time. Every time a major update occurs, the SEO community goes into “all hands on deck” mode for months. There is stress, panic, uncertainty, theorizing, frustration, and resignation. However, about six weeks out, the new system begins to become clearer. It’s important for clients to be in communication with vendors during this time, but not daily or even weekly. Every 10–15 business days is about right for the dots to start connecting.
  6. Be willing to adapt. Accepting that what was true yesterday may be false tomorrow is painful. Be data driven, not “hunch” driven. Just because you think Google might have done X, remember that it’s only a theory until you test it multiple times and verify it. This has always been the reality of how Google ranks websites. For whatever reason, though, people have a hard time accepting this fact. It takes humility to accept that Google holds the keys.
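
Google publishes no checklist for point 3, but a routine audit can start with something as simple as the sketch below: flag backlinks whose anchor text is an exact-match “money” keyword, or that come from domains linking to you suspiciously often. The data, keywords, and thresholds here are hypothetical; real audits work from link exports (for example, from Google Webmaster Tools or a link-research tool):

```python
# Hypothetical starting point for a routine backlink audit -- the data,
# keywords, and thresholds are made up; a real audit would use exported
# link data rather than a hard-coded list.
from collections import Counter

MONEY_KEYWORDS = {"personal injury lawyer", "car accident attorney"}

backlinks = [  # (linking domain, anchor text) -- made-up example rows
    ("blog-a.com", "Jane Doe Law Firm"),
    ("dir-b.net", "personal injury lawyer"),
    ("dir-b.net", "personal injury lawyer"),
    ("dir-b.net", "car accident attorney"),
    ("news-c.org", "janedoelaw.com"),
]

def audit(links, max_links_per_domain=2):
    """Flag links with exact-match money anchors or heavy repetition per domain."""
    per_domain = Counter(domain for domain, _ in links)
    flags = []
    for domain, anchor in links:
        if anchor.lower() in MONEY_KEYWORDS:
            flags.append((domain, anchor, "exact-match money anchor"))
        elif per_domain[domain] > max_links_per_domain:
            flags.append((domain, anchor, "many links from one domain"))
    return flags

for domain, anchor, reason in audit(backlinks):
    print(f"{domain}: '{anchor}' -> {reason}")
```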

Ultimately, when the algorithm is updated, the best course of action is to take a deep breath and evaluate everything that changes. Compare what the sites that were penalized have in common and what the websites that held strong have in common. Take action appropriately and with time, the results will return.

Here is a helpful video from Matt Cutts on Penguin 2.0 and what changes to expect in the coming months.


Google Chairman’s Predictions Hint at AuthorRank

Excerpts from an upcoming book by Google Chairman Eric Schmidt were published by The Wall Street Journal last week. In the article, Schmidt laid out his seven predictions for the future of the digital age, but for marketers one sentence stood out from the rest:

“Within search results, information tied to verified online profiles will be ranked higher than content without such verification, which will result in more users naturally clicking on the top (verified) results.”

To many, this seemed less like a prediction and more like a veiled confirmation of what marketers had long suspected: AuthorRank is coming.

Great AuthorRank graphic by Mode Digital.

The AuthorRank saga began in 2005 when Google filed a patent for something called “Agent Rank.” The document described how the search engine could use a number of metrics to determine an “agent’s” position within a subject area. By outlining a way to consider an agent’s popularity and authority within a given subject area, marketers inferred that Google was looking to supplement the cold statistics of search with human factors.

Traditionally, Google had not had access to enough data to warrant using social interactions as a direct ranking factor. The company found a way to solve this problem in 2011 with the introduction of Google+. With its social network providing access to a trove of qualitative data, the logical next step was to incorporate it into search. Thus, AuthorRank became a reality.

Simply put, the goal of AuthorRank is to determine the credibility and popularity of an individual and the content they publish. Many of the factors likely to affect AuthorRank are old hat for SEOs: the number of followers on social networks, the frequency of shares, and the number of links, Likes, tweets, and so on. The difference, however, is that AuthorRank ties these metrics to the individual who publishes the content – not the website that hosts it.

This change has huge implications in the SEO world, but the first step for anyone marketing online is to claim authorship of their content. Any content a marketer has created should be tied to a verified Google+ profile. This means an author’s Google+ profile must link to the pages that host their content, and vice versa. Once this is done, the long climb to dominant AuthorRank begins.
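
A quick way to sanity-check that both links are in place is sketched below. The URLs are placeholders, and this only confirms that each page contains a link to the other; it does not confirm that Google has verified authorship (in practice the profile’s “Contributor to” link usually points at the site rather than at every individual post):

```python
# Sketch only: checks that the content page and the profile page each contain
# a link to the other. URLs are placeholders; this does not confirm that
# Google has actually verified authorship.
import urllib.request

def page_links_to(page_url: str, target_url: str) -> bool:
    """True if the target URL appears anywhere in the page's HTML."""
    html = urllib.request.urlopen(page_url, timeout=10).read().decode("utf-8", "ignore")
    return target_url in html

def authorship_links_reciprocal(content_url: str, profile_url: str) -> bool:
    return (page_links_to(content_url, profile_url)
            and page_links_to(profile_url, content_url))

if __name__ == "__main__":
    print(authorship_links_reciprocal(
        "https://www.example.com/blog/post",
        "https://plus.google.com/+ExampleAuthor"))
```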

Everyone in the SEO industry is anxiously awaiting Google’s Panda Update 25. It is not yet known if this specific update will further the push from PageRank to AuthorRank, but Google is clearly headed in that direction. The web strategists at The Search Engine Guys have been preparing for the move to AuthorRank for some time. If you have questions about SEO, AuthorRank, and how to prepare your website, please contact us today.


Google to overtake Facebook in Display Advertising

According to research firm eMarketer, Google is likely to exceed Facebook in selling online display ads in the United States. Google is expected to take a 15.4% share of the U.S. market, and eMarketer projects it will make $2.31 billion in revenue from online display ads. Display ads are distinct from the text-based ads that appear next to search results, which still account for the bulk of Google’s revenue.

This lead in online display ads marks a milestone for Google: it is the first time the company will lead in three different modes of online advertising – display ads, web-search ads, and mobile ads.

eMarketer calculates that Facebook will hold 14.4% of the market this year with $2.16 billion in U.S. revenue. Back in February, eMarketer predicted that Facebook would be on top with 16.8% of the market and Google with 16.5%.

(Graph from The Wall Street Journal)

eMarketer estimates that the display ad market will grow 21.5% to almost $15 billion in the U.S. this year, up from last year’s $12.3 billion. Collectively, Google and Facebook will take nearly 30% of this year’s display ad revenue, and for 2014 eMarketer predicts the two companies will hold 37% of the market.
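
As a quick sanity check, those figures hang together; the back-of-the-envelope arithmetic below uses only the numbers reported above (small differences come from rounding):

```python
# Back-of-the-envelope check using only the figures quoted above.
market = 12.3 * 1.215               # last year's $12.3B grown 21.5% -> ~$14.9B ("almost $15B")
google, facebook = 2.31, 2.16       # projected U.S. display revenue this year, in $B

print(round(market, 1))                            # ~14.9
print(round(100 * google / market, 1))             # ~15.5 (reported as 15.4%)
print(round(100 * facebook / market, 1))           # ~14.5 (reported as 14.4%)
print(round(100 * (google + facebook) / market))   # ~30 ("nearly 30%")
```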

Google continues to make it easy for advertisers to use one source for all of their online marketing needs – traditional desktop AdWords, mobile AdWords, display ads, and remarketing – both within its search network and on thousands of partner websites in Google’s content/display network.

Contact us at The Search Engine Guys if you would like to explore options for PPC advertising on Google’s network.


Microsoft’s new “Bing it on” marketing campaign challenges Google

As reported by tech-news website The Droid Guy, Microsoft is adopting the methods of the Pepsi Challenge in its new “Bing It On” challenge against the Google search engine. The challenge pits Bing against Google in a side-by-side comparison (with the brand names removed) to see which service provides better and more relevant search results.

According to an independent study that sampled nearly 1,000 people across the United States over 10 rounds, users preferred Bing to Google almost 2:1. Of those polled, 57.4% chose Bing, 30.2% chose Google, and for 12.4% the result was a draw. The version of the challenge on Bing’s site runs only five rounds.

Find out what you prefer at http://www.bingiton.com/


Cloud CEO Speaks at M&L Legal Management and Marketing Seminar

Last week Joe Devine, CEO of Cloud [8] Sixteen, Inc., spoke at the 2012 summer session of M&L Legal Management and Marketing forum. The goal of these seminars is to provide an arena in which different personal injury lawyers and law firms can gather together and share ideas on how to be successful in their industry. Since 1992, Marshall Hughes and Lee Coleman have been sponsoring these events. Past seminars have taken place in Aruba, Utah, and the Dominican Republic.

This year’s conference took place at The Fairmont Hotel in Vancouver, British Columbia from June 21 – 23. The topic of Joe’s talk was “The Art of Designing Your Website for Maximum Conversion.”


Cloud [8] Sixteen, Inc., CEO Speaks at Annual WTLA Seminar

Last week, Joe Devine, CEO of Cloud [8] Sixteen, Inc., spoke at the Western Trial Lawyers Association’s annual seminar, lasting from June 11th to the 14th, at the Fairmont Kea Lani in Maui, Hawaii. The theme of this year’s conference was “Pride, Passion & The Practice of Law.”

The conference had a number of guest speakers discussing various topics, and Mr. Devine spoke about the benefits that search engine optimization and live chat can have specifically for attorneys’ practices. This is the third year that Mr. Devine has spoken at this conference.

Ready to start working with us? Call (512) 806-7955 or Email Us Now