5 Steps to Creating an Amazing LinkedIn Company Page

June 4  |  Social Bookmarking, Social Media Sites, Social Networks  |   Alex Chan


Everyone knows that social media is the latest platform for promoting products, but where do we go beyond Facebook and Twitter? Facebook lets us reach friends of friends and Twitter broadcasts to news addicts, so what about everyone else? In the past year, LinkedIn has reshaped its site to present a new opportunity to reach professionals, businesses, and companies, and it now has real traffic of its own to send. Creating a LinkedIn profile for your company is a step forward in building your company’s image and increasing its popularity.

Making a LinkedIn profile for your company is very easy and, more importantly, it’s free! Here are a few tips on how to create a good LinkedIn profile for your company:

1. Don’t skip the “About Us” section
Write a detailed summary of your company under the “About Us” section at the bottom. Put effort into writing a good description for everyone to see, and tell people what’s important about your company. Upload a crisp and clean cover image as well; the cover image is much larger than your profile image, so it’s the first thing visitors will see.

2. Use the Products & Services tab
All social networks are a great way of promoting your products and services. With the Products and Services tab, you can add images of products and even provide links so people can buy them. LinkedIn also gives you the opportunity to place your most prominent product at the top of the list so people see it first.

3. Connect with your employees
After you create a profile for your company, ask all your employees to connect to it and list your company as their employer. This not only increases the circle of people who can connect to your company, but also gives visitors a way to connect directly with your employees. It makes your company accessible and open to the general public.

4. Recruit people
Your LinkedIn profile can also be used as a recruitment tool. The Careers page lets you recruit talented people and make contacts with other professionals. If you wish, you can pay to have the Careers page enabled on your company profile.

5. Generate good content
Last but not least, always make sure you keep your profile page updated and interesting to users. Always generate good quality content on your LinkedIn page. Customers like to be kept up to date with new information, whether it is about your company or your industry in general. Good content is the hook that attracts more followers; if you have nothing to say, nobody will follow you.


Footer Links Targeted By Google?

June 3  |  Link Building  |   Ryan Clark

I recently got into a discussion with a crew of marketers and “SEO” folks I meet with for beers, and the topic of footer links was something that kind of had me stumped. With all the confusion and weirdness surrounding links now, thanks to Google and whatnot, it’s hard to form the right opinion without knowing the facts.

What we do know is that Google isn’t a big fan of them, and I suspect for the most part they will be devalued, especially if site-wide. Google’s link schemes guide clearly states a distaste for them, but as usual little is said, which leaves a lot to the imagination.

Flash back 5-10 years and site-wide footer and sidebar links were the equivalent of gold in the online marketing landscape. They worked so well that Google had to crack down hard, and although that seemed to put a lot of webmasters off the practice, blackhatters were and still are buying them up like crack rock. Hell, I’ve even come on board with a client and found their marketing team doing this without even an inkling that it was against Google’s TOS.

So back to my beer meeting. One of us, who wishes to remain anonymous, asked if he should be nofollowing his web design credit links on client sites. I know this is a topic that has been talked about before (here, here, here and here), but I still wasn’t too sure, especially after reading those links.

For the most part I think Google will simply devalue those links. On the other hand, if you’re gaming anchor text, then I fully believe trouble will come. I also think there could be a problem (although there’s no way to tell) if you give a client a discount in exchange for that credit link. That alone is enough to get me worried, so I thought this topic could be touched on yet again and left open for others to comment on as time goes by.
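If you do decide to play it safe, the change itself is tiny: just add rel="nofollow" to the credit link. Here’s a rough sketch of what that looks like in a client footer (the agency name and URL are made up for illustration):

    <!-- Hypothetical design credit link in a client site footer, with nofollow added -->
    <p class="credit">
      Website by <a href="http://example-design-agency.com" rel="nofollow">Example Design Agency</a>
    </p>

You keep the branding and the referral clicks; you just stop passing PageRank through the link.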

Matt Cutts On Footer Links

 
While that video is a little old, it gives us a little more insight, despite Matt being vague as f%$! as per usual. I understand he has to word it like that since things change over time, and who knows what Google’s algorithm will be doing a year from now. My personal opinion is that if most of your links come from the footer area as credit links, you may want to reconsider your linking efforts asap.

Two Real Examples Of Footer Link Problems

Looking back over the past year, I wanted to mention a couple of case studies surrounding the footer link issue. Since they’re quite recent, I hope they push the discussion further. I’m especially unsure about links from web design companies, which usually put a “designed by” credit link in client footers.

The WPMU Case: http://moz.com/blog/how-wpmuorg-recovered-from-the-penguin-update

This is an interesting case study of sorts, thanks to Ross Hudgens, who took the time to analyze and report on how it played out. I was specifically glued to this case because the links affecting WPMU were, for the most part, in their control. The other aspect I found quite interesting was that the majority of the links were not focused on “money” anchor text.

I always like to think that Penguin, or whatever algorithm update, will also look harshly at how many links of a given type a website has. With WPMU getting the bulk of their links from credit links in their themes and plugins, we can also assume the link placements are largely the same. Since the Penguin update is an algorithm and not a human, you still have to be careful even if you’re a big brand producing great work.


Personally, based on how quickly their penalty was lifted, I suspect it was revoked manually by someone at Google. It is rare to get your case made public in the media, and I would have done some damage control if I were working for the big G. WPMU also had a lot of the footer links in their control, as you can see from the post. The point I want to hammer home is, of course, the problem with their “footer” links.

The Jit Bit Case: https://news.ycombinator.com/item?id=5792268

This was a very recent discussion in which Matt Cutts himself chimed in quite a bit on their issue. Jit Bit creates really awesome software which does have “powered by” credit links on the sites running their gear. Here’s the bit that caught my attention in their thread over at Hacker News.

Our site WAS affected by Penguin indeed, even by the first version of Penguin a year ago. Because we sell web-forum software and ticket-software – that both have a “powered by” link at the bottom, our SEO agency advised to add that…
And we’re still trying to recover… I’m contacting our clients one-by-one and we’re changing those links to “nofollow”.

Sadly, another great company gets inept advice from an “SEO” company and is left cleaning up the mess on their own dime. It is extremely important to vet your SEO company before taking them on, and yes, this even goes for our clients.

Further down in the comments is this other tidbit from the Jit Bit team.

Which ones? The ones above? I’m not sure. Will have to contact them and get back.
I think, the links we’re being penalised for – are mostly the links that come from our software widget. Check out this page, the very-very top of it: http://algonac.thebestcityguides.com/Forum/forum4195-Minneol…
We have HUNDREDS OF THOUSANDS links like this (I’mm looking at my WMT right now). I guess this is the main reason. Our site is hit by Penguin…

So if you’re implementing a strategy like this, I’d be very careful as it could really affect your business. Footer links can get out of control and make for a stressful removal process some time down the road.
 


What to Expect From Google’s SEO Department in 2013

May 21  |  Link Cleanup, News, Video SEO, Wordpress SEO  |   Alex Chan

If you’re wondering what’s in store from Google in 2013, the video above shows Matt Cutts giving a detailed explanation of what to expect.

Coming soon is the newest version of the Penguin algorithm, called Penguin 2.0. This update will bring serious improvements over Penguin 1.0, and we should expect it any day now. The new Penguin will detect black hat SEO, go deeper into websites, and have a larger impact than the previous updates. Sites should get ready by using Google’s Disavow Tool to disown any spammy links pointing at them.
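As a refresher, the disavow file itself is nothing fancy: a plain text file with one URL or domain per line, where lines starting with # are comments and the domain: prefix covers every link from that domain. A hypothetical example (all of the domains below are made up):

    # Contacted these site owners, no response
    domain:spammy-directory-example.com
    domain:cheap-link-network-example.net
    # A single page rather than a whole domain
    http://blog-example.org/paid-post.html

Keep in mind that disavowing is meant as a last resort, after you’ve actually tried to get the links removed.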

Along with Penguin, a new Panda update is coming. This update is expected to have an enormous impact on many websites, and Google will try to “lower” the effects to keep from harming sites unnecessarily. The last major Panda update had a serious impact across the web, and we should expect something similar from this one. The fact that Google says it needs to “tone it down” means it’s going to be HUGE.

After punishing Interflora and a few UK newspapers, Google will start looking into advertorials and other types of advertising that directly violate its guidelines. Paying for advertising in order to pass PageRank is one of the issues Google has been regularly monitoring this year, and a number of link networks have already been shut down or heavily penalized.

Many users in the UK complained about spammy results for queries like payday loans on Google.co.uk, and some pornographic queries have had similar problems. Google will introduce two new changes aimed at these spammy query areas. One method is to detect links “upstream” in an effort to reduce the value of spam links. Along with monitoring links in a new way, Google plans to develop more sophisticated link analysis software that will help its search engine understand linking patterns better.

When certain websites are an authority in a field like travel, medicine, or real estate, Google will try to boost those pages, allowing them to rank toward the top of the search results. This is great for existing authoritative websites, but it will make it harder for new websites to join the race. In addition to handing out higher authority, Google will make another attempt at cleaning up the “clustering” issue, first raised back in 2012, where a single website dominates the first page for a keyword. That includes websites like Yelp, and possibly even Google Maps, YouTube, or other Google properties. This is an attempt to diversify the results and offer the end user a wider range of information.

Last but not least, Google hopes to improve communication with webmasters. This is great for white hat developers, and we’ve already seen some improvements after Matt Cutts explained the best method for handling a manual Google penalty. Google also announced that it will provide more detailed and explicit information to webmasters in Webmaster Tools. That is something Google has always been criticized for, as anyone who’s dealt with a penalty can relate.

By the sounds of things, Google has a few good ideas that are going to help webmasters along the way. The improved communication sounds great, but everyone should be prepped for both Penguin 2.0 and the upcoming version of Panda.

Penguin 2.0 And Beyond In 2013

May 14  |  Link Building  |   Ryan Clark

 
Here’s the latest video from Matt Cutts talking about what to expect in regard to how Google will treat links in the coming months. The war on link spammers rages on; Google is still heavily gamed at this point, and the results for any money-making keyword are full of junk.

Pay close attention to what he says, as this is an unusually long video for the series and it is full of insight. He covers not only Penguin 2.0, but all the other link changes coming to the table, including:

  • What Penguin 2.0 will go after
  • How Google will handle hacked sites
  • How Google will go after tiered link building
  • How Google will better choose authorities in a niche… AuthorRank, anyone?

I’m not too worried about these updates, as we never target anchor text, nor do we actually build links for clients as opposed to earning them. It’s a different world, but if you want to run a real brand, you cannot be out there chasing junk links just to pick off those coveted keywords.

Here’s a transcript of the video for those who cannot watch it:

Opening and Disclaimers

Hey everybody, today’s webmaster video is answering the question: “What should we expect in the next few months in terms of SEO for Google?” Okay, so, first off, we’re taping this video in early May of 2013, so I’ll give you a little bit of an idea about what to expect as far as what Google’s working on in terms of the webspam team. In terms of what you should be working on, we try to make sure that is pretty constant and uniform. Try to make sure you make a great site that users love, that they’ll want to tell their friends about, bookmark, come back to, visit over and over again, ya know, all the things that make a site compelling. We try to make sure that if that’s your goal, we’re aligned with that goal, and therefore, as long as you’re working hard for users, we’re working hard to try to show your high quality content to users as well. But at the same time, people are always curious about, OK, what should we expect coming down the pipe in terms of what kinds of things Google’s working on. One of the reasons that we don’t usually talk that much about the kinds of things we’re working on is that the plans can change. Ya know, the timing can change, when we launch things can change. So take this with a grain of salt. This is, as of today, the things that look like they’ve gotten some approval or they look pretty promising. Okay, with all those kinds of disclaimers, let’s talk a little bit about the sort of stuff that we’re working on.

Intro to Penguin 2.0

We’re relatively close to deploying the next generation of Penguin. Internally, we call it “Penguin 2.0”. And again, Penguin is a webspam change that’s dedicated to try to find blackhat webspam and try to target and address that. So this one is a little more comprehensive than Penguin 1.0 and we expect it to go a little bit deeper and have a little bit more of an impact than the original version of Penguin.

Paid Ads/Coverage/Links

We’ve also been looking at advertorials, that is, sort of native advertising, and those sorts of things that violate our quality guidelines. So again, if someone pays for coverage or pays for an ad or something like that, those ads should not flow PageRank. We’ve seen a few sites in the US and around the world that take money and then do link to websites and pass PageRank. So we’ll be looking at some efforts to be a little bit stronger on our enforcement as far as advertorials that violate our quality guidelines. Now there’s nothing wrong inherently with advertorials or native advertising, but they should not flow PageRank and there should be clear and conspicuous disclosure so that users realize that something is paid, not organic or editorial.

Examples of Specific Niches to Be Targeted

It’s kind of interesting. We get a lot of great feedback from outside of Google. For example, there were people complaining about searches like “payday loans” on Google.co.uk. So we have two different changes that try to tackle those kinds of queries in a couple different ways. We can’t get into too much detail about exactly how they work, but I’m kind of excited that we’re going from having just general queries be a little more clean to going to some of these areas that have traditionally been a little more spammy, including, for example, some more pornographic queries. And some of these changes might have a little bit more of an impact in those kinds of areas that are a little more contested by various spammers and that sort of thing.

Going Upstream and More Sophisticated Link Analysis

We’re also looking at some ways to go upstream to deny the value to link spammers–some people who spam links in various ways. We’ve got some nice ideas on trying to make sure that that becomes less effective and so we expect that that will roll out over the next few months as well. And in fact, we’re working on a completely different system that does more sophisticated link analysis. We’re still in the early days for that, but it’s pretty exciting. We’ve got some data now that we’re ready to start munging and see how good it looks and we’ll see whether that bears fruit or not.

Hacked Sites

We also continue to work on hacked sites in a couple different ways. Number one, trying to detect them better; we hope in the next few months to roll out a next generation of hacked sites detection that is even more comprehensive. And also to try to communicate better to webmasters, because sometimes we see confusion between hacked sites and sites that serve up malware, and ideally you’d have a one stop shop where, once someone realizes that they have been hacked, they can go to Webmaster Tools and have some single spot where they get a lot more info to sort of point them in the right way to hopefully clean up those hacked sites.

Shout out to the Spam Lords

So if you’re doing high quality content whenever you’re doing SEO, this shouldn’t be some big surprise, and you shouldn’t have to worry about a lot of different changes. If you’ve been hanging out on a lot of black hat forums and trading different types of spamming package tips and that sort of stuff, then it might be a more eventful summer for you.

Authority Sites

But we have also been working on a lot of ways to help regular webmasters. We’re doing a better job of detecting when someone is sort of an authority in a specific space, could be medical or could be travel or whatever, and trying to make sure that those rank a little more highly if you’re some sort of authority or a site that according to the algorithms we think might be a little more appropriate for users.

“Borderline Quality” Sites… Possibly Good for Sites that Were Pandalized

We’ve also been looking at Panda and seeing if we can find some additional signals, and we think we’ve got some, to help refine things for the sites that are kind of in the border zone, in the grey area a little bit. So if we can soften the effect a little bit for those sites that we believe have some additional signals of quality, that will help sites that might have previously been affected to some degree by Panda.

Ranking Multiple Pages of Same Domain for the Same Query

We’ve also heard a lot of feedback from some people that if you go three pages deep you’ll see a cluster of several results all from one domain. We’ve actually made things better, in that you’re less likely to see that on the first page and more likely to see it on the following pages. And we’re looking at a change which might deploy, which would basically say that once you’ve seen a cluster of results from one site, then you’d be less likely to see more results from that site as you go deeper into the next pages of Google search results. That has been good feedback that people have been sending us. We continue to refine host clustering and host crowding and all those sorts of things, but we’ll continue to listen to feedback and see what we can do even better.

Back to Hacked Sites

And then we’re going to keep trying to figure out how to get more information to webmasters. I mentioned more information for sites that are hacked and ways they might be able to do things; we’re also going to be looking for ways we can provide more concrete details, more example URLs, that webmasters can use to figure out where to go to diagnose their site.

Conclusion

That’s just a rough snapshot of how things look right now. Things can absolutely change and be in flux, we might see new attacks, we might need to move our resources around, but that’s a little bit about what to expect over the next few months in the summer of 2013. I think it’s going to be a lot of fun. I’m really excited about a lot of these changes, because we do expect to see really good improvements, in that people who are link spamming or doing various black hat spam should be less likely to show up, I think, by the end of the summer. And at the same time we’ve got a lot of nice changes queued up that hopefully will help small/medium businesses and regular webmasters as well. So that’s just a very quick idea about what to expect in terms of SEO for the next few months as far as Google.


Fiverr Links Cause Google Penalty – DUH

May 12  |  Link Building  |   Ryan Clark

It amazes me to see people still buying up Fiverr link building packages these days and not expecting to tank their legitimate businesses. Either that, or we SEOs and Google are doing a terrible job of educating the average business owner on what’s good and what’s not. Buying thousands of junk links to your site is not only a waste of time and money, it’s a potential business ender. I’ve dealt with many companies that have been duped and literally destroyed their entire business because of it. Not cool!

As some of you may know, I spend a lot of time helping out and reading what other people are experiencing on Google’s Webmaster Help Forums. The number of times I’ve had to bite my tongue and explain to clients why spending $xxxx on great content marketing beats $50 worth of bought links is getting high up there. Real brands should be earning links, and if you can’t do that, you’re never going to become a niche leader.

Fiverr links have killed my site, now I’m working to fix old mistakes. Please help!

Read The Thread -> http://productforums.google.com/forum/#!category-topic/webmasters/crawling-indexing–ranking/ZiXmtqjIjhE
 
Ignore the usual rudeness of the inept moderators in the thread, as they’re usually snobby little %$##!@%&s. This website owner is in for hundreds of hours of work before their site will recover. To get out of a manual penalty you’ll have to do a lot of work removing roughly 90% of those junk links. You will have to document everything you do to present to the almighty Google gods, and even that doesn’t work too often. Their Disavow Tool is also a last resort, as they want you to try to remove all the spam links you had built first… yikes!

There is, however, a lot of great information for people in the same boat, and I find reading these threads is a good start. Dealing with a manual Google penalty is extremely frustrating, and I’m glad we don’t do it as a service. I would, however, like to hear from other people in the comments about their experiences, and if you have any questions, don’t hesitate to ask away.


User Generated Content Can Get You Penalized

April 29  |  Link Building, Reputation Management  |   Alex Chan


Mozilla was manually penalized by Google this week for having tons of user-generated spam on its website. User-generated spam usually appears on websites that have forums, guestbook pages, and so on. But in this case, it turned out it was not entirely Mozilla’s fault. Until a few days ago, searching for “site:mozilla.org cheap payday seo” would have yielded pages of spammy forum posts.

After receiving a notification from Google stating that it had applied a manual spam action against them, Chris More, Mozilla’s Web Production Manager, immediately started repairing the problem. It turned out to be difficult, as he could not find the reason Mozilla was being penalized. He stated that he could not find or detect any spam content on www.mozilla.org. Since Google was not willing to be more precise about the spam, Mozilla could not remove the spam in question.

Google has often discussed the idea of being more transparent when handling spam actions, but that has not happened yet. The idea was quickly turned down, since Google does not want to give spammers pointers on where they went wrong. This creates a very big problem for site owners, since they get penalized for something they did not know existed in the first place.

Manual penalties are easier to handle than algorithmic ones. You detect the spam on the website, delete it, and send Google a reconsideration request describing your actions. But the problem with undetectable user-generated spam is that people usually do not know what to fix or what to mention in the reconsideration request. Here’s the catch: by sending a reconsideration request, a website can ask Google to point out the user-generated spam. The procedure might take a couple of weeks, but it should clear up the problems along the way.

So to all webmasters: carefully monitor your websites and police your user-generated content. That seems to be the best defense against all the spammers out there.
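One simple safeguard, if your forum or guestbook templates are under your control, is to make sure links inside user-generated content carry rel="nofollow" so they pass no PageRank to spammers. A hypothetical forum template snippet (the class names and URL are made up):

    <!-- Hypothetical forum post template: links supplied by users get rel="nofollow" -->
    <div class="forum-post">
      <p>Great write-up, thanks for sharing!</p>
      <a href="http://user-submitted-example.com" rel="nofollow">user-submitted link</a>
    </div>

It won’t stop spammers from posting, but it removes most of their incentive and keeps their links from passing any value.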

Tips To Grow Your Newsletter Email List Fast

April 22  |  Brand Building  |   Alex Chan


Many companies and marketers have stated that one of their biggest goals is to grow their mailing lists. These days, most of them are trying to engage their customers and interact with them, and quality lists have overtaken lists built purely on quantity.

So what steps must companies take to collect a larger number of email addresses from customers that are real and not simply made up on the spot?

The first and most important thing for companies and marketers is their website. The design of the website can genuinely attract or put off customers right from the start. User-friendly sites, where customers can easily browse and look through pages, are highly recommended. The opt-in form must be easy to find and easy to fill in. Most companies tend to focus on the social networks and forget to optimize the opt-in feature; remember, this form is what provides you with emails from potential or existing customers. The opt-in form should appear in as many places as possible on your website – at the top, at the bottom, on the right and on the left side of the page, always visible to the user. When creating the forms, try to keep them simple. Nobody wants to fill in long and boring forms that make them feel like they are being interrogated. Make it simple and precise!
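To illustrate “simple and precise”, a bare-bones opt-in form only needs an email field and a button. A rough sketch (the action URL and class names here are placeholders, not tied to any particular platform):

    <!-- Minimal newsletter opt-in form; the action URL is a placeholder -->
    <form action="/newsletter/subscribe" method="post" class="optin">
      <label for="optin-email">Get our newsletter:</label>
      <input type="email" id="optin-email" name="email" placeholder="you@example.com" required>
      <button type="submit">Subscribe</button>
    </form>

Anything beyond that, such as extra required fields, is exactly the kind of friction that keeps people from signing up.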

Another way of collecting emails is through direct contact with your customers. Set up collection points at your stores and ask customers to provide their email addresses and mobile numbers in return for some benefit. The best way to capture that information is to have customers enter it themselves; touchpads are the most efficient option for this, and having customers type in the details avoids misspellings and illegible handwriting. Always ask permission before you start sending SMS messages and emails to customers.

And last but not least, don’t forget the social pages. Your website should let users register through their social accounts; otherwise many will not bother registering at all. Equip your site with Facebook, Twitter, and Google+ sign-in options. The chances that customers will find you through your social pages are high, so use the potential these networks have to offer and benefit from it.

Reasons Why New Pages Rank Well Then Drop

April 15  |  Video SEO  |   Alex Chan

One of the most common occurrences with new web pages is that they reach a very high ranking on Google within the first couple of weeks after publishing, then experience a slow ranking drop.

As this is a continuing trend for newly developed pages that have quality content, Matt Cutts has an answer to why this happens.

When a new page is published with quality content, the freshness of the content raises its ranking very rapidly. It takes Google a little while to investigate where the content came from, and time is needed to determine whether the content is original or just a retelling of an existing story. After a few days or weeks, Google makes a decision which can affect the original ranking.

Matt Cutts highlights that Google is just taking its best guess based on the quality of the content, the keywords, and the freshness of the content. As time passes, more information becomes available and gets incorporated into the ranking algorithm. Rankings usually change over time as the content of pages changes, links change, and certain information becomes more popular than others. The algorithm adapts to all these changes and reassesses the web pages in order to provide the best results.
