Here are the two main reasons you should make sure your website is mobile friendly:
To test whether Google considers your website “mobile friendly,” run it through Google’s Mobile-Friendly Test tool.
Keep in mind that if you have a great website and don’t want to go through the process of re-designing it from scratch, we are able to convert your website into a responsive theme. Depending on your design, some or many elements may need to change… but we can often maintain the overall look and feel of your design.
If you would like to explore a new responsive website, or convert your existing website to a responsive theme, call us at 800.267.1704 and one of our account managers will help you determine your options.
On July 24, 2014, Google released an update to improve its local search algorithm. This new update is intended to leverage traditional web ranking signals in order to provide more accurate and relevant search results for the user. Though Google has not officially given this local search update a name, Search Engine Land has decided to title this update “Pigeon.”
The following are some of the most significant changes that webmasters have noticed from the Pigeon update:
Like other Google algorithm updates, it is difficult to understand the full purpose of these changes during the early phases of the rollout. Here at The Search Engine Guys, we will be keeping an eye on the Pigeon update and will provide more information as it becomes available.
If you have questions about how this update may have affected your website or business in the search results, please contact us today at (512) 394-7234.
To contact the author, emails can be sent to: firstname.lastname@example.org
Yesterday, Google announced the launch of Google My Business. After years of trying to upgrade Places and merge it with Plus, Google has finally released a brand new platform that encompasses all of Google’s business tools, making it easier to manage your local business listings. While this doesn’t change much for our current Google+ pages, the new My Business interface provides clearer instructions on how to fill out, verify, and update a listing. With this rollout, it’s exciting to see that Google is proactively trying to solve the confusion between Places and Plus by creating a unified dashboard that makes life easier for small- to medium-sized businesses.
Matt Cutts, the head of search spam at Google, recently released a video discussing a future in which links carry less weight in a website’s ranking. To save you the three minutes: Google knows that as long as it depends on links for the majority of its algorithm, that algorithm will always be easy to manipulate. So it makes sense for Google to try everything it can to move away from links as the primary ranking factor.
Back in 2005, Google filed a patent called Agent Rank for a technique that would allow it to rank a piece of content based on the person who authored it. In theory, when content is added to the web, a digital signature would connect the real-life author to a digital author profile in Google’s database. This signature would be unique and attached to every piece of content the author puts on the web, creating a portfolio of sorts for each author.
An authority score given to each author, or “Author Rank,” could then be used to give weight to new articles and content from that author. For instance, an author could launch a brand-new website, and because their Author Rank is high across 20 other websites, Google would associate them with other great content and potentially give that website more authority, even though it is new.
Some websites, though, don’t include authorship info, and Google doesn’t have any good way to assign value to these pages outside of links. But that could change with Google’s acquisition of DeepMind, home to some of the world’s leading researchers in artificial intelligence and deep learning. The term “deep learning” has been in use since the mid-2000s to describe architectures that can make connections between different sets of data. It would make sense, then, that deep learning is most effective when it has large quantities of data to sort through and analyze.
Well, as of January 2014, Chrome has a dominant market share at 55% of all internet browsing, Google Analytics is on over 15 million websites, the Ad Network reaches over 2 million websites, and Gmail is the leader in webmail, so it’s no secret that the amount of data Google is able to collect is simply unfathomable. Up until now, sorting through so much data and drawing informed conclusions has been troublesome for computers. From TechCrunch:
World-renowned artificial intelligence expert and Google’s new Director of Engineering, Ray Kurzweil, wants to build a search engine so sophisticated that it knows users better than they know themselves. “I envision in some years that the majority of search queries will be answered without you actually asking.”
But now, Deep Mind’s AI program will play a role in all of Google’s infrastructure, including search, advertising, and social. With the end goal to document and draw smart connections between the real world’s people, places, events, and things, we must assume that Google is going to be using their mass repositories of data to create individual user profiles for each of us – including authors and readers.
Users will see an increased level of relevancy in searches. For instance, if someone has emails in their inbox discussing the purchase of a new Honda Civic, and then they like Honda on Google+, and finally post pictures of their new Honda Civic to G+ with hashtags, then when that user goes to search for “change spark plugs”, Google will tailor the search results to include videos and tutorials specific to the user’s history, which is changing spark plugs on a Honda Civic.
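The tailoring described above can be sketched as a toy reranker. To be clear, this is our own illustration, not Google’s actual method; the result titles and interest terms are made up:

```python
# Toy illustration of history-based reranking -- NOT Google's real
# algorithm. Assume an "interest profile" has already been extracted
# from signals like emails, +1s, and hashtags.

def rerank(results, interests):
    """Boost results whose titles mention the user's known interests."""
    def score(result):
        title = result.lower()
        # One point per interest term found in the result title.
        return sum(term in title for term in interests)
    # Stable sort: ties keep their original (generic) ranking.
    return sorted(results, key=score, reverse=True)

generic_results = [
    "How to change spark plugs (any car)",
    "Change spark plugs on a Ford F-150",
    "Change spark plugs on a Honda Civic",
]
profile = ["honda", "civic"]  # hypothetical user profile

print(rerank(generic_results, profile)[0])
# -> Change spark plugs on a Honda Civic
```

Because the sort is stable, results that don’t match the profile keep their generic ordering, which mirrors the idea that personalization reorders rather than removes results.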
In turn, authors will see an increase in engagement from their users. Let’s say I’m a mechanic who runs a blog detailing simple maintenance on Hondas and Toyotas (Japanese cars). Google sees that I frequently discuss topics related to car maintenance, and that I mention Japanese brands, not American ones. Hopefully, users who search for “how to change my car’s oil” and own Fords won’t see my blog, and by the same token, the time users spend on my site will likely increase because the content is more relevant to their lives.
In the future of the internet, where digital and real life become more integrated, it will be important that we associate ourselves and our businesses with others that are considered to be industry leaders, in hopes of being given credit and benefit of the doubt based on association. As such, we should all start building a digital reputation for ourselves now, so that we aren’t behind when the time comes.
Universal Analytics has been in beta for quite some time now, but it has finally received the call-up to replace Classic Analytics. Universal Analytics allows for more robust tracking and gives more insight into user interaction on a given website. There are plenty of changes and updates worth mentioning, but one of the biggest is the addition of the User ID.
While anonymous in nature, the User ID will allow us to track a particular user’s activity on-site and follow this individual across multiple domains, devices, and sessions. When a user visits one of our sites, they’ll be assigned an ID that is unique to them and can be referenced if the user returns to the site or visits another one of our domains. This will shed more light on the habits of a user before they convert on one of our clients’ sites, and answer a slew of questions that are important to our marketing decisions, e.g.: Are individuals more likely to convert after a first visit, or does it take repeat visits before a user takes action? If people are more likely to convert on a second or third visit, how long does it typically take for the visitor to return to the site, and what sources are they coming from?
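Once every session carries a User ID, questions like these reduce to simple aggregation over session data. A minimal sketch in Python, using hypothetical exported session rows (field names are our own, not the Analytics reporting API):

```python
from collections import defaultdict

# Hypothetical session rows keyed by a Universal Analytics User ID.
sessions = [
    {"user_id": "u1", "visit": 1, "source": "organic", "converted": False},
    {"user_id": "u1", "visit": 2, "source": "direct",  "converted": True},
    {"user_id": "u2", "visit": 1, "source": "ppc",     "converted": True},
]

def visits_before_conversion(rows):
    """For each user, count how many visits it took to convert (None if never)."""
    by_user = defaultdict(list)
    for row in rows:
        by_user[row["user_id"]].append(row)

    counts = {}
    for uid, user_rows in by_user.items():
        counts[uid] = None
        for row in sorted(user_rows, key=lambda r: r["visit"]):
            if row["converted"]:
                counts[uid] = row["visit"]
                break
    return counts

print(visits_before_conversion(sessions))  # {'u1': 2, 'u2': 1}
```

The same grouping approach answers the follow-up questions too: grouping by `source` instead of counting visits shows which channels drive returning visitors.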
When you start to get a better idea of how people interact with your site, and see the full picture of how someone came to the decision to reach out to you or purchase your product, the choices you make on the advertising and marketing front rest on a better foundation and, hopefully, carry a greater likelihood of success. The addition of the User ID to Universal Analytics is a welcome piece of data that should help us gain a clearer picture of how users interact with our websites and, of course, how we can help our clients successfully market their services.
If you have questions about using Google Analytics to improve your website or marketing efforts, contact us today at (512) 394-7234.
April 7th, 2014 – Posted by Andrew Cox to Google Analytics.
With the first quarter of 2014 behind us, we wanted to take a quick look at some of the biggest news on the SEO / Google front so far this year.
Google has kicked off this year with a whirlwind of changes and will only continue to push more updates in order to improve and evolve into a better and more user-friendly search engine.
If you have questions about Google updates and how they may have impacted your website, call us at (512) 394-7234 for more information.
Ever since Matt Cutts, Google’s head engineer in charge of web spam, released a video in May about what to expect in the next few months in terms of SEO, there have been noticeable fluctuations in the search results for many website owners.
The MozCast Google weather tracker is a tool designed by the highly regarded web-marketing company Moz. Recently, the MozCast site displayed what many SEO experts consider alarming temperatures for the month of June. A “regular” temperature reading usually falls somewhere between 50 and 80 degrees, but two readings this month shot up over 100 degrees, breaking records on the weather chart. In fact, June 27, 2013 yielded a 120-degree reading, the highest ever seen in the history of MozCast. Overall, changes in the Google algorithm have shaken up the search results a few times this past month, including rollouts such as:
In March, Google mentioned that it will stop publicly announcing Panda updates, as the algorithm will continue to roll out monthly. Google is currently pushing out Panda updates over a 10-day period every 30 days.
What does this mean exactly? Google pushes the update on a specific day, and from that start date the algorithm continues to roll out over a 10-day span. The push then repeats itself every month.
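Described this way, whether a given day falls inside a Panda push is simple date arithmetic. A sketch using the 10-day and 30-day figures above (the start date below is hypothetical, since Google doesn’t announce it):

```python
from datetime import date

ROLLOUT_SPAN = 10   # days each Panda push takes to roll out
CYCLE_LENGTH = 30   # days between the start of one push and the next

def in_rollout(day, first_start):
    """True if `day` falls within a 10-day push of the 30-day cycle."""
    offset = (day - first_start).days
    return offset >= 0 and offset % CYCLE_LENGTH < ROLLOUT_SPAN

start = date(2013, 3, 15)                    # hypothetical first push date
print(in_rollout(date(2013, 3, 20), start))  # True: day 5 of the push
print(in_rollout(date(2013, 4, 1), start))   # False: between pushes
```

The modulo over a 30-day cycle captures the monthly repetition, which is why ranking shifts can appear in roughly 10-day windows even without any announcement.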
Google has been working on refining the Panda algorithm to help sites that are lingering “on the border” of being impacted by the update. Cutts mentioned that Google is “softening” the Panda algorithm by adding signals that look for quality metrics on these types of websites. It is unclear how soft these rollouts will be or how much change will take place in the search results.
Cutts also mentioned in this video (at 2:30) that there will be a new search update targeting “spammy queries.” Roughly a month after the video was posted, Cutts sent a tweet out confirming his statement in the video.
Certain industries, payday loans being one, are infamous for abusing Google’s algorithm by using automated software to build backlinks quickly. Of course, these websites do get caught after some time and lose their rankings once detected. Once a website is flagged by Google for its unfavorable link-building tactics, these companies toss the old website, start over with a new one, and repeat. This “churn and burn” method violates Google’s guidelines and has been on Google’s list of things to clean up.
This algorithm update targets link-building and spam tactics globally. Other affected search terms include “car insurance” and pornography-related queries.
On June 25th, the MozCast reached 113 degrees, surpassing the previous high of 102 degrees set on December 13, 2012. In a blog post, Dr. Peter J. Meyers of Moz discusses case studies monitoring de-personalized and de-localized queries.
Meyers conducted two different studies on the search terms “limousine service” and “auto auction.” Both cases showed similar patterns that indicate a partial-match domain update.
While MozCast temperatures are fluctuating wildly, Cutts also threw a multi-week update into the mix. Details on this update haven’t been confirmed yet, but there has been speculation as to what kind of update it will be. Will it be a follow-up to the payday loan algorithm, or is this PMD update just a trigger for something of greater impact?
Cutts announced that the rollout began on June 21, 2013 and would continue to affect the search results until the week after July 4, 2013. We’ll keep an eye on this as the update keeps rolling out through July.
Google strives to improve the quality of search results by keeping quality sites ranked high and devaluing sites that prove to be harmful or untrustworthy. To ensure that Google users get the best-matching results for their queries, algorithm pushes are necessary to keep spam and questionable websites off the results page.
Considering the turbulence that occurred in the last week of June, expect continued algorithm fluctuations over the next month. If the pattern continues with the multi-week update, we could be approaching a few more stormy SEO days. Look out for:
If your site has lost placement recently or has been moving around in the SERPs, the high number of algorithm changes that rolled out in June may be the reason. While we expect things to start settling down soon, the effects of these updates are likely to continue for a little while longer.
Over the years, Google has continually updated the layout of search results. On June 18th, Google announced they were rolling out an interactive carousel of local results “for local dining, nightlife, hotels, and other attractions on desktop” that will be featured at the top of the page. Below is an example of the new Local Carousel:
On the day of the announcement, Google+ Local community manager Jade Wang offered advice for businesses in a post on the Google and Your Business Forum:
How can I get my business to show up in the carousel?
While we can’t guarantee inclusion in search results, we can say that the carousel will show results from listings in Google Maps using categories. Just as in regular ranking, Google’s algorithms take into account many factors to select the places and results that are most relevant to the user. This algorithm based approach is also used to decide which businesses are in the carousel.
Why is this feature only available for some business verticals?
We’re committed to providing users a high quality search experience for every query. The carousel filtering experience is a good fit for some categories of local businesses. We will continue to experiment with different designs and interfaces to make sure that users get the information they’re looking for, fast.
I’d like to see this feature in more languages and countries, please!
We’ll work as fast as we can to roll out new features in as many places as possible, but have nothing to announce at this time.
My business is on the carousel, but I’d like to change the photo. How can I do that?
The Google business listing is one of several sources we use for the photos in the carousel, and making sure high-quality images are posted to it will help improve your photo. However the image selection, like the actual ranking of businesses, is primarily decided by algorithms and so we can’t guarantee complete control over the image.
While the answers to the questions listed above offer some great insight, you may still be asking yourself the following questions:
At The Search Engine Guys, these are all important questions we discuss when building our local search strategy. We have dedicated our time to studying the history of changes regarding local search and keep ourselves informed regarding the most recent updates.
In comparison to the previous local results on Google, the new carousel displays results horizontally rather than vertically. There must be at least 5 local results for the carousel to be displayed; if there are fewer, Google shows the original 1-, 2-, 3-, or 4-pack display. Local search expert Mike Blumenthal noted that “the new Local Carousel will show up to 20 results if there are that many in any given market.” Depending on the size of your screen, the number of results displayed before scrolling will vary. Each business result features a photo, the business name, and rating and review information. The photo below shows how the Local Carousel aligns with the old 7-pack result set:
I am interested in seeing whether clicks for local results will be more evenly distributed now that the results run horizontally rather than vertically. An article on Search Engine Land discusses a study by search marketer Matthew Hunt, who used heat maps to gauge interest levels for results. It found that 48% of searchers clicked the carousel while only 14.5% clicked the map. I also look forward to seeing whether the photo the Google algorithm selects for local listings has an impact on click-through rates.
Now more than ever, it is important to place locally on Google. Because the carousel features an image for each business, it is important that all of the photos you have posted are high quality and representative of your business. Reviews will also become more important because of Google’s announcement discussing the return of the 5-star review system. Once a business has received 5 reviews, the stars will be highlighted in gold, which could be a determining factor in piquing user interest.
It is hard to be certain what changes or updates will be made in the future, but it’s important to always be prepared. Will Google slowly roll out the Local Carousel to other categories, such as attorneys? We aren’t quite sure, but we are adjusting our local search strategy in case it does. Will the local carousel extend to mobile search results, where the local results predominantly show before the organic results? If so, it will be crucial to gain local placement. Here at The Search Engine Guys, we begin researching updates as soon as they are announced and discuss what potential changes could happen in the future in order to be better prepared. We are anxiously waiting to see what Google will do next.
Since the idea of the internet and websites first caught on around 1989, marketers and designers have been studying intently to find the best way to leverage consumer interaction. Just like in traditional marketing, a great deal of research is conducted with the intent of discovering how web users engage with the information on any given website. When The Search Engine Guys take on a new client for design and optimization, one thing we try to keep in mind is the way people naturally scan any given page for information that they’re looking for. Luckily, we don’t need to spend much time researching this user interaction because there are plenty of other groups interested in this kind of data, and they have made it readily available to anyone who searches for it. I want to provide a little bit of insight into some of the more popular studies on eye-tracking and how we use these data in our designs.
The idea of tracking eye movements and creating diagrams dates back to the late 1800s, when a French ophthalmologist noticed his test subjects were reading in a series of short stops and quick movements, as opposed to one long, smooth sweep. Here is an early diagram of fixations and saccades, the quick movements from point to point:
This observation was further explored in the 1900s, first with primitive contact lenses that had aluminum pointers, and later by reflecting beams of light off the subjects’ eyes and onto film. It was later observed that eye movements depend largely on the task given to the user. To quote Alfred L. Yarbus,
“Records of eye movements show that the observer’s attention is usually held only by certain elements of the picture…. Eye movement reflects the human thought processes; so the observer’s thought may be followed to some extent from records of eye movement (the thought accompanying the examination of the particular object). It is easy to determine from these records which elements attract the observer’s eye (and, consequently, his thought), in what order, and how often.”
In the 1980’s we saw the advent of real-time eye tracking using computers. This allowed for a much more accurate depiction of how the user interacts with any given image or text. The pieces were finally coming together to lay the foundation for eye tracking on web pages.
Well, not in the scary big brother sense. In 2009, Microsoft sponsored a popular study titled, What Do You See When You’re Surfing? Using Eye Tracking to Predict Salient Regions of Web Pages. The premise of the study was to gain “an understanding of how people allocate their visual attention when viewing Web pages”. While there had been similar studies in the past, the researchers point out that these studies were generally ambiguous, only identifying scan paths as opposed to fixation time, or using only three different sample pages for test subjects. Leveraging an eye-tracker built by Tobii Technology, Microsoft presented 361 web pages to 20 test subjects. With the data they collected, Microsoft was able to describe the general flow of eye movements, which provides us with invaluable information about user interaction. A few notable facts:
The study suggested that those who use the internet once or more a day spend less time actually reading the content, and scan pages faster than those who do not use the web as often. Other findings include:
The most linked-to research on eye-tracking was conducted by the Nielsen Norman Group, or NN/g. The study is a 355-page report based on usage data from over 300 users looking at hundreds of different websites. The findings revealed many important insights.
We found that users’ main reading behavior was fairly consistent across many different sites and tasks. This dominant reading pattern looks somewhat like an F and has the following three components:
- Users first read in a horizontal movement, usually across the upper part of the content area. This initial element forms the F’s top bar.
- Next, users move down the page a bit and then read across in a second horizontal movement that typically covers a shorter area than the previous movement. This additional element forms the F’s lower bar.
- Finally, users scan the content’s left side in a vertical movement. Sometimes this is a fairly slow and systematic scan that appears as a solid stripe on an eyetracking heatmap. Other times users move faster, creating a spottier heatmap. This last element forms the F’s stem.
F-shaped Heat Maps
NN/g also reported several interesting findings about why users turn to search functions, how they analyze search results, and how they choose which result to click:
One final point worth mentioning from this study is the classification of scanning behaviors, and why people may read more or less content on your website.
- Exhaustive review: People look extensively and repeatedly at an area or page because they expect the information they want to be there, but they cannot find it.
- Directed scanning: A person looks for specific information such as a name or word and expects to find it on the page.
- Motivated scanning: Scan patterns fueled by good page layout, interesting content, personal interest, or a trusted suggestion.
- Impressionable scanning: A person is more open to reading the words as the author has written them.
Here at The Search Engine Guys, we pay attention to details like these to make sure that our clients’ websites are optimized to capture the attention of the user. Our goal is to make sure the user has quick access to whatever information they may be looking for: contact and brand information is seen quickly, followed by easy navigation to deeper areas of information. It makes sense, then, that web designers use this kind of knowledge to map the flow of a website. Being able to leverage a user’s instincts means higher conversion rates and, in turn, more leads, and that’s good news for everyone.
Last week, Cloud Sixteen, Inc. was excited to be invited to take part in the M&L Legal Marketing and Management Seminar in Palm Beach, Aruba. The conference was founded in 1992 by J. Marshall Hughes and Lee Coleman, who have continued to host the event biannually since then, allowing legal marketers to bring their knowledge and expertise to the law firms in attendance.
This year, Cloud Sixteen, Inc. CEO Joe Devine was invited to speak at the conference, held January 11th through 16th. Notable sponsors for the event included Plaintiff Investment Funding, Innovative Legal Marketing, and Lien Resolution Services, LLC, as well as other established marketing firms. We were proud to be a part of this great event and hope to continue participating in the years to come. Thank you to M&L Legal Marketing and Management for putting on such a great event, and we look forward to the next seminar in summer 2012!