Thursday, 15 October 2015

Professional Local SEO Service: Why You Can’t Ignore It

You might be a bit puzzled when you hear the term ‘local SEO’. What is it? And do you really need it? If these questions are taking shape in your mind, you are not alone. Many people, even those with a general overview of SEO services, don’t have clear answers to them. However, understanding local SEO is not that difficult.
Basically, local SEO is the process of optimizing for local search. Most small businesses depend on the local market; they cannot survive without local customers. This is where local SEO steps in. Professional local SEO services enhance a business's visibility in local search results. A business's NAP (Name, Address, and Phone number) and customer reviews are two of the many factors that impact local rankings. Here are six main local SEO strategies:

  • Creating local business pages for leading search engines
  • Optimizing and properly categorizing local business pages
  • Accurate business citations
  • Online business reviews
  • Top quality images
  • Optimization of business website
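The NAP details mentioned above can be made machine-readable by marking them up with structured data. Below is a minimal sketch using schema.org's LocalBusiness vocabulary in JSON-LD; the business name, address, and phone number are placeholders, so substitute your own details:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Business",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Example St",
    "addressLocality": "Beverly Hills",
    "addressRegion": "CA",
    "postalCode": "90210"
  },
  "telephone": "+1-310-555-0100"
}
</script>
```

Keeping this markup identical to the NAP shown on the page itself, and to your citations elsewhere on the web, is the main point: consistency is what search engines reward.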

In addition to these strategies, a professional SEO company may devise other action points based on the needs of a particular business. Just like regular SEO, local SEO strategies differ by business domain. Done properly, local SEO can deliver a real increase in sales.

EZ Marketing Tech is a leading local SEO expert in Beverly Hills. We have helped many small businesses create a strong local presence on the web. Request a callback to learn how we can help you.

Tuesday, 13 October 2015

Google Places – A Better Way to Target Local Audience

As searches on the internet become increasingly local in intent, businesses need to think of ways to turn them in their favor. Whether on computers or mobile devices, local searches are now commonplace. Keeping this trend in mind, businesses should adopt a strategy that helps them beat their local competitors and become the preferred choice of local people who need their products or services. But how can one achieve this? Working with a Google Places optimization services company is the way to go.

So, what is Google Places and how can it be fruitful? Google Places, surfaced through Google Maps, is in simple terms a way to market your business at a local level. Businesses serving Beverly Hills need Google Places optimization services to get the attention of potential customers, as well as search engines, for this particular area.

Even if you aren't faring well in organic results, you can still make up ground and climb the local rankings with the help of Google Places. The advantages don't end there: because a Google Maps listing presents you as a physical entity, people are more likely to buy from you.

Tuesday, 18 August 2015

Google Says: Now You Can Use Pipes in Title Tags

In the past, Google has said not to use pipes in URLs because the pipe is one of the characters that can imply other things. But using them in your title tags is fine, and we see it often.

Google's Gary Illyes said on Twitter that it "doesn't matter" whether you use pipes in your title tags.
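In practice, a pipe in a title tag is just a readability convention for separating a page's topic from the brand name. A typical (hypothetical) example:

```html
<title>Blue Widgets | Example Company</title>
```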


But back in 2010, John Mueller of Google wrote:

“We'll generally crawl and index any accessible and valid URL, I'm sure you'll find many URLs with "pipe" characters in it in our index. That said, just because it's possible doesn't necessarily make it a good idea :-). Similar to using spaces in URLs, those characters may cause issues elsewhere, so personally, I'd try to minimize the risk of problems anywhere by avoiding those kinds of URLs.”

If you don't have the time, nor the slightest idea how to optimize title tags, you can hire a trusted SEO company to do it for you.
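Mueller's caution about pipes in URLs is easy to see in practice: the pipe is not a safe URL character, so encoders percent-escape it as %7C, which makes such URLs ugly and error-prone when copied or logged. A quick check with Python's standard library (the example path is made up):

```python
from urllib.parse import quote, unquote

# The pipe is not a safe URL character, so encoders escape it as %7C.
path = "/widgets|red|large"
encoded = quote(path)  # "/" is treated as safe by default, "|" is not
print(encoded)                   # /widgets%7Cred%7Clarge
print(unquote(encoded) == path)  # True: the round trip is lossless
```

Title tags have no such constraint, which is why the pipe is harmless there.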

Google Says: 404s Don't Impact the Google Panda 4.2 Algorithm


Google's Gary Illyes gave a short answer to a Google Panda question asking whether having 404 pages has an impact on the overall Google Panda algorithm. Gary said on Twitter, "nope," it does not.



The truth is, Gary answered this assuming the webmaster meant a negative impact. Does having 404s trigger Panda in a bad way? The obvious answer to that is no.

But many sites hit by Panda do try to 404 and remove a ton of pages on their sites to recover from a Panda penalty.

Google has recommended removing content from sites impacted by Panda.

If you don't have the time, nor the knowledge to recover from a Google Panda penalty, let an internet marketing company do it for you.

Friday, 31 July 2015

Google Says Links Within PDF Documents Pass PageRank


Google's Gary Illyes said in the comments section of this Google+ post that links within Adobe PDF documents do pass PageRank.

Google also said as much in this blog post from 2011: "generally links in PDF files are treated similar to links in HTML: they can pass PageRank and other indexing signals, and we may follow them after we have crawled the PDF file.”

Links in .pdf documents are shown in your Google Webmaster Tools backlinks report; they accumulate PageRank and can even pass it on.

It is a good idea to place links in the .pdf documents you give away on the web, not only for PageRank reasons but also to give users an easy way to visit your site for more information. Think about usability when you create .pdf documents in the same way you think about usability for your website.

Also, if you fill in the "properties" attributes of a .pdf document, you can give it a title that will appear in the SERPs, just like the title tag of an .html webpage.

Finally, in addition to .pdf documents, you can also get viable backlinks and clickthroughs from .ppt (PowerPoint), .xls (Excel), and other file types. Consider allowing other webmasters to include them on their sites; that way they can bring you links from other domains.

Also, you can hire an affordable SEO company to build high-quality backlinks and improve your website's visibility in the SERPs.

Google Saw a 180% Increase In Hacked Sites In The Past Year

Google launched their #NoHacked campaign again and shared some really sad news. In the past year, Google saw a 180% increase in hacked sites. That is just insane.

Google wrote on their blog, "over the past year Google has noticed a 180% increase in the number of sites getting hacked."

So over the next month or so, Google is going to try to educate webmasters about hacked sites: how to prevent them, how to fix them, how to stay secure, and so on.

Google is going to provide:

* hacking insights on their blog every Monday
* actionable tips on their social channels every Wednesday
* a security-themed Hangout on Air

180% - just insane.

Read more about how to avoid being the target of hackers on the official Google blog here: http://googlewebmastercentral.blogspot.com/2015/07/nohacked-how-to-avoid-being-target-of.html

If your site has been hacked, you can also consult a trusted and reputable SEO services provider.

New Google Warning: GoogleBot Cannot Access CSS & JS

Google is now sending out a rush of new warnings via Google Search Console (formerly Google Webmaster Tools) to notify webmasters that GoogleBot cannot access the CSS and JS (JavaScript) files on their websites.

Don’t be alarmed if you received a warning from Google in your email today — many webmasters were alerted that “Googlebot cannot access your JavaScript and/or CSS files.”

Google sent out this warning via Search Console, also reminding webmasters that Googlebot’s inability to access those files may result in “suboptimal rankings”.
That sounds bad, but the good news is that there’s an easy fix, and implementing it may even end up helping your site.

Tons of webmasters are concerned after receiving these warnings.


Google has been telling webmasters not to block CSS & JavaScript for years and years. Here is Matt Cutts in 2012 telling webmasters not to block them. The webmaster guidelines were updated to say not to block them. The new Fetch and Render tool warns you when you block CSS and JavaScript. We also know Google now renders the page as a user would see it, so blocking CSS/JS can have a big impact.

It seems Google is now sending these notices out en masse. The message reads:

“Google systems have recently detected an issue with your homepage that affects how well our algorithms render and index your content. Specifically, Googlebot cannot access your JavaScript and/or CSS files because of restrictions in your robots.txt file. These files help Google understand that your website works properly so blocking access to these assets can result in suboptimal rankings.”

This is not a penalty notification, but a warning that if Google cannot see your whole site, it may result in poorer rankings.

If you get this message, you may consult a trusted SEO services company. You can also use the Fetch and Render tool to diagnose the issue more deeply.
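One quick way to check locally whether a robots.txt file blocks an asset is Python's standard-library robotparser. Treat this as a rough sketch only: the stdlib parser does not understand Google's wildcard extensions, and the asset paths below are hypothetical.

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt that blocks script and style directories
# for all crawlers, which would trigger Google's warning.
robots_txt = """\
User-agent: *
Disallow: /assets/js/
Disallow: /assets/css/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Googlebot falls under the "*" group here, so these assets are blocked.
print(parser.can_fetch("Googlebot", "http://example.com/assets/js/app.js"))  # False
print(parser.can_fetch("Googlebot", "http://example.com/index.html"))        # True
```

If the first check prints False for a CSS or JS file your pages actually need, that is exactly the situation Google's warning describes.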

How To Quickly Unblock JavaScript & CSS Assets

Gary Illyes from Google posted on Stack Overflow the cheat, or quick way, of unblocking your JavaScript and CSS files from Google. Gary said the "simplest form of allow rule to allow crawling javascript and css resources" is to add this to your robots.txt file:

User-Agent: Googlebot
Allow: .js
Allow: .css

Gary said this will open it all up for GoogleBot.
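If you prefer to be more explicit, Google's robots.txt syntax also supports the * and $ wildcard operators, so a more targeted variant of Gary's rule might look like the sketch below. Note that the $ anchor means URLs with query strings (such as app.js?v=2) will not match, so drop it if your assets are versioned via query parameters:

```
User-Agent: Googlebot
Allow: /*.js$
Allow: /*.css$
```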