The final episode in our six-part One-Hour Guide to SEO series deals with a topic that’s a perennial favorite among SEOs: link building. Today, learn why links are important to both SEO and to Google, how Google likely measures the value of links, and a few key ways to begin earning your own.
Click the whiteboard image above to open a high-resolution version in a new tab!
Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. We are back with the final part of the One-Hour Guide to SEO, and this week we're talking about why links matter to search engines, how you can earn links, and things to consider when doing link building.
Why are links important to SEO?
So we’ve discussed how search engines rank pages based on the value they provide to users. We’ve talked about how they consider keyword use and relevant topics and content on the page. But search engines also have this tool of being able to look at all of the links across the web and how pages link to other pages, how they point between pages.
So it turns out that Google had this insight early on that what other people say about you is more important, at least to them, than what you say about yourself. So you may say, “I am the best resource on the web for learning about web marketing.” But it turns out Google is not going to believe you unless many other sources, that they also trust, say the same thing. Google’s big innovation, back in 1997 and 1998, when Sergey Brin and Larry Page came out with their search engine, Google, was PageRank, this idea that by looking at all the links that point to all the pages on the internet, and then doing this recursive process of seeing which are the most important and most linked-to pages, they could give each page on the web a weight, an amount of PageRank.
Then those pages that had a lot of PageRank, because many people linked to them or many powerful people linked to them, would then pass more weight when they linked. That understanding of the web is still in place today. It’s still a way that Google thinks about links. They’ve almost certainly moved on from the very simplistic PageRank algorithm that came out in the late ’90s, but that thinking underlies everything they’re doing.
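The recursive process described above can be illustrated with a toy PageRank calculation. This is a sketch of the original late-'90s idea run on a hypothetical four-page web, not Google's modern system:

```python
# Toy PageRank: each page splits its score among the pages it links to,
# plus a small "damping" share given to every page. Hypothetical web:
links = {          # page -> pages it links to
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
    "D": ["C"],
}
pages = list(links)
rank = {p: 1.0 / len(pages) for p in pages}  # start with equal weight
damping = 0.85

for _ in range(50):  # iterate until the scores settle
    new_rank = {p: (1 - damping) / len(pages) for p in pages}
    for page, outlinks in links.items():
        share = damping * rank[page] / len(outlinks)
        for target in outlinks:
            new_rank[target] += share
    rank = new_rank

# "C" is the most linked-to page, so it accumulates the most weight,
# and "A" benefits from being linked by the powerful page "C".
best = max(rank, key=rank.get)
```

Notice how the recursion captures both ideas in the text: being linked to a lot helps, and being linked to by pages that are themselves heavily linked helps even more.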
How does Google measure the value of links?
Today, Google measures the value of links in many very sophisticated ways, which I’m not going to try and get into, and they’re not public about most of these anyway. But there is a lot of intelligence that we have about how they think about links, including things like: more important, more authoritative, more well-linked-to pages are going to pass more weight when they link.
A.) More important, authoritative, well-linked-to pages pass more weight when they link
That’s true of both individual URLs, an individual page, and websites, a whole website. So for example, if a page on The New York Times links to yoursite.com, that is almost certainly going to be vastly more powerful and influential in moving your rankings or moving your ability to rank in the future than if randstinysite.info — which I haven’t yet registered, but I’ll get on that — links to yoursite.com.
This weighting, this understanding that there are powerful, important, authoritative websites and less powerful, important, authoritative websites, and that the more powerful ones tend to provide more ranking value, is why so many SEOs and marketers use metrics like Moz’s Domain Authority, or some of the metrics from Moz’s competitors out in the software space, to try and intuit how powerful and influential a link will be if a given domain points to them.
B.) Diversity of domains, rate of link growth, and editorial nature of links ALL matter
So the diversity of domains, the rate of link growth, and the editorial nature of those links all matter. So, for example, if I get many new links from many new websites that have never linked to me before, and they are editorially given, meaning I haven’t spammed to place them and I haven’t paid to place them, they were granted to me because of interesting things that I did or because those sites wanted to editorially endorse my work or my resources, and I do that over time in greater quantities and at a greater rate of acceleration than my competitors, I am likely to outrank them for the words and phrases related to those topics, assuming that all the other smart SEO things we’ve talked about in this One-Hour Guide have also been done.
C.) HTML-readable links that don’t have rel=”nofollow” and contain relevant anchor text on indexable pages pass link benefit
HTML-readable links, meaning links that can be seen as a simple text browser or a simple bot, like Googlebot, browses the web (Googlebot can be much more complex, as we talked about in the technical SEO episode, but not necessarily all the time). Those HTML-readable links that don’t have the rel=”nofollow” parameter, which is something you can append to links to say “I don’t editorially endorse this,” and many, many websites do.
If you post a link to Twitter or to Facebook or to LinkedIn or to YouTube, they’re going to carry this rel=”nofollow,” saying I, YouTube, don’t editorially endorse this website that this random user has uploaded a video about. OK. Well, it’s still nice to get a link from YouTube. But a link that contains relevant anchor text, on an indexable page, one that Google can actually crawl and see, is going to provide the maximum link benefit.
So <a href=”https://yoursite.com”>great tool for audience intelligence</a>, that would be the ideal link for my new startup, for example, which is SparkToro, because we do audience intelligence, and someone saying we’re a great tool is perfect. This is a link that Google can read, and it provides this information about what we do.
It says great tool for audience intelligence. Awesome. That is powerful anchor text that will help us rank for those words and phrases. There are loads more factors. There are things like which pages are linked to and which pages are linked from. There are spam characteristics and trustworthiness of the sources. Alt attributes, when they’re used in image tags, serve as the anchor text for the link, if the image is a link.
There’s the relationship, the topical relationship, of the linking page and linking site. There’s the text surrounding the link, which I think some tools out there offer you information about. There’s the location on the page. All of this stuff is used by Google, along with hundreds more factors, to weight links. The important part for us, when we think about links, is generally speaking this: if you cover your bases here (it’s indexable, carries good anchor text, it’s from diverse domains, it’s at a good pace, it is editorially given in nature, and it’s from important, authoritative, and well-linked-to sites), you’re going to be golden 99% of the time.
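To make the "HTML-readable" idea concrete, here's a short Python sketch, using only the standard library, that pulls out followed links and their anchor text while skipping anything marked rel="nofollow". The sample HTML is hypothetical:

```python
from html.parser import HTMLParser

class LinkAuditParser(HTMLParser):
    """Collect (href, anchor text) pairs for followed links only,
    skipping any link that carries rel="nofollow"."""
    def __init__(self):
        super().__init__()
        self.links = []      # (href, anchor_text) results
        self._href = None    # href of the followed <a> we're inside, if any
        self._text = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            attrs = dict(attrs)
            rel = (attrs.get("rel") or "").lower()
            if "nofollow" not in rel:
                self._href = attrs.get("href")
                self._text = []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            self.links.append((self._href, "".join(self._text).strip()))
            self._href = None

html = ('<p><a href="https://yoursite.com">great tool for audience '
        'intelligence</a> and <a rel="nofollow" '
        'href="https://example.com">ignored link</a></p>')
parser = LinkAuditParser()
parser.feed(html)
# parser.links keeps only the followed link with its anchor text
```

A real crawler (and Google) handles far more edge cases, but this captures the distinction the section draws: the first link passes anchor text and link benefit, while the nofollow'd one is filtered out.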
Are links still important to Google?
Many folks, I think, wisely ask, “Are links still that important to Google? It seems like the search engine has grown in its understanding of the web and its capacities.” Well, there is some pretty solid evidence that links are still very powerful. I think the two most compelling pieces of evidence to me are, one, the correlation of link metrics over time.
So like Google, Moz itself produces an index of the web. It is billions and billions of pages — I think it’s actually trillions of links across hundreds of billions of pages. Moz produces metrics like the number of linking root domains to any given domain on the web or any given page on the web.
Moz has a metric called Domain Authority, or DA, which sort of tries to best replicate or best correlate with Google’s own rankings. So metrics like these, over time, have been shockingly stable. If it were the case someday that Google demoted the value of links in their ranking systems, basically said links are not worth that much, you would expect to see a rapid drop.
But from 2007 to 2019, we’ve never really seen that. It’s fluctuated. Mostly it fluctuates based on the size of the link index. So for many years Ahrefs and Majestic were bigger link indices than Moz. They had better link data, and their metrics were better correlated.
Now Moz, since 2018, is much bigger and has higher correlation than they do. So the various tools are sort of warring with each other, trying to get better and better for their customers. You can see those correlations with Google are pretty high, pretty stable, especially for a system that supposedly contains hundreds, if not thousands, of elements.
When you see a correlation of 0.25 or 0.3 with one number, linking root domains or Page Authority or something like that, that’s pretty surprising. The second piece of evidence is that many SEOs will observe this, and I think this is why so many SEO firms and companies pitch their clients this way: the number of new, high-quality, editorially given linking root domains (so The New York Times linked to me, and now The Washington Post linked to me, and now wired.com linked to me, these high-quality, different domains) correlates very nicely with ranking positions.
So if you are ranking number 12 for a keyword phrase and suddenly that page generates many new links from high-quality sources, you can expect to see rapid movement up toward page one, position one, two, or three, and this is very frequent.
How do I get links?
Obviously, this is not rare, but very common. So I think the next reasonable question to ask is, “OK, Rand, you’ve convinced me. Links are important. How do I get some?” Glad you asked. There are an infinite number of ways to earn new links, and I will not be able to represent them all here. But professional SEOs and professional web marketers often use tactics that fall under a few buckets, and this is certainly not an exhaustive list, but it can give you some starting points.
1. Content & outreach
The first one is content and outreach. Essentially, the marketer finds a resource that they could produce, something relevant to their business, to what they provide for customers, to interesting insights that they have, and they produce that resource knowing that there are people and publications out there that are likely to want to link to it once it exists.
Then they let those people and publications know. This is essentially how press and PR work. This is how a lot of content building and link outreach works. You produce the content itself, the resource, whatever it is, the tool, the dataset, the report, and then you message the people and publications who are likely to want to cover it or link to it or talk about it. That process is tried-and-true. It has worked very well for many, many marketers.
2. Link reclamation
Second is link reclamation. So this is essentially the process of saying, “Gosh, there are websites out there that used to link to me, but stopped linking.” The link broke. The link points to a 404, a page that no longer loads on my website.
The link was supposed to be a link, but they didn’t include the link. They said SparkToro, but they forgot to actually point to the SparkToro website. I should drop them a line. Maybe I’ll tweet at them, at the reporter who wrote about it, and be like, “Hey, you forgot the link.” Those types of link reclamation processes can be very effective as well.
They’re often some of the easiest, lowest-hanging fruit in the link building world.
3. Directories, resource pages, groups, events, etc.
Directories, resource pages, groups, events, things that you can join and participate in, both online and offline, so long as they have a website, often link to your site. The process is simply joining or submitting or sponsoring or what have you.
Most of the time, for example, when I get invited to speak at an event, they will take my biography, a short, three-sentence blurb that includes a link to my website and what I do, and they will put it on their site. So pitching to speak at events is a way to get included in these groups. I started Moz with my mom, Gillian Muessig, and Moz has forever been a woman-owned business, and so there are women-owned business directories.
I don’t think we actually did this, but we could easily go, “Hey, you should include Moz as a woman-owned business. We should be part of your directory here in Seattle.” Great, that’s a group we could absolutely join and get links from.
4. Competitors’ links
So this is basically the practice of finding and replicating the links that your competitors have earned. You almost certainly will need to use tools to do this, though there are some free ways to do it.
The simple, free way to do it is to say, “I have competitor 1’s brand name and competitor 2’s brand name. I’m going to search for the combination of those two in Google, and I’m going to look for places that have written about and linked to both of them and see if I can also replicate the tactics that got them coverage.” The slightly more sophisticated way is to go use a tool. Moz’s Link Explorer does this.
So do tools from people like Majestic and Ahrefs. I’m not sure if SEMrush does. But basically you can plug in, “Here’s me. Here’s my competitors. Tell me who links to them and does not link to me.” Moz’s tool calls this the Link Intersect function. But you don’t even need the Link Intersect function.
You can just plug in a competitor’s domain, look at all the links that point to them, and then start to replicate their tactics. There are hundreds more tactics, and many, many resources on Moz’s website and other great websites about SEO that talk about many of them, and you can certainly invest in those. Or you could conceivably hire someone who knows what they’re doing to go do this for you. Links are still powerful.
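If you export the domains linking to each competitor from your tool of choice, the intersect itself is just set arithmetic. A minimal sketch with made-up domain lists:

```python
# A toy "link intersect": given the sets of domains linking to each
# competitor (hypothetical data, in practice exported from a link tool),
# find domains that link to both rivals but not to you.
links_to_competitor_1 = {"nytimes.com", "wired.com", "blog.example.com"}
links_to_competitor_2 = {"nytimes.com", "wired.com", "news.example.org"}
links_to_me = {"wired.com"}

# Domains worth pursuing: they already cover both competitors, not me
opportunities = (links_to_competitor_1 & links_to_competitor_2) - links_to_me
```

The real work, of course, is figuring out what tactic earned each of those links and replicating it, but the filtering step is this simple.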
OK. Thank you so much. I want to express a huge amount of appreciation to Moz and to Tyler, who’s behind the camera — he’s waving right now, you can’t see it, but he looks adorable waving — and to everyone who has helped make this possible, including Cyrus Shepard and Britney Muller and many others.
Hopefully, this one-hour segment on SEO can help you upgrade your skills dramatically. Hopefully, you’ll send it to some other folks who might need to upgrade their understanding and their skills around the practice. And I’ll see you again next week for another edition of Whiteboard Friday. Take care.
Let’s get real for a moment: As much as we hear about positive team cultures and healthy work environments in the digital marketing space, many of us encounter workplace scenarios that are far from ideal. Some of us might even be part of a team where we feel discouraged from sharing new ideas or alternative solutions because we know they will be shot down without discussion. Even worse, there are some who feel afraid to ask questions or seek help because their workplace culture doesn’t provide a safe place for learning.
These types of situations, and many others like them, are present in far too many work environments. But what if I told you it doesn’t have to be this way?
Over the last ten years as a team manager at various agencies, I’ve been working to foster a work environment where my employees feel empowered to share their thoughts and can safely learn from their mistakes. Through my experiences, I have found a few strategies to combat negative culture and replace it with a culture of vulnerability and creativity.
Below, I offer four simple steps you can follow that will transform your work environment into one that encourages new ideas, allows for feedback and positive change, and ultimately makes you and your team better digital marketers.
Vulnerability leads to creativity
I first learned about the impact of vulnerability after watching a viral TED talk by Dr. Brené Brown. She defined vulnerability as “uncertainty, risk, and emotional exposure.” She also described vulnerability as “the birthplace of love, belonging, joy, courage, empathy, and creativity.” From this, I learned that to create a culture of vulnerability is to create a culture of creativity. And isn’t creativity at the heart of what we SEOs do?
A culture of vulnerability encourages us to take risks, learn from mistakes, share insights, and deliver top results to our clients. In the fast-paced world of digital marketing, we simply cannot achieve top results with the tactics of yesterday. We also can’t sit around and wait for the next Moz Blog post or marketing conference, either. Our best course of action is to take risks, make mistakes, learn from those mistakes, and share insights with others. We have to learn from those with more experience than us and share what we know with those who have less experience. In other words, we have to be vulnerable.
Below is a list of four ways you can help create a culture of vulnerability. Whether you are a manager or not, you can impact your team’s culture.
1. Get a second pair of eyes on your next project
Are you finishing up an exciting project for your client? Did you just spend hours of research and implementation to optimize the perfect page? Perfect! Now go ask someone to critique it!
As simple as it sounds, this can make a huge difference in fostering a culture of creativity. It’s also extremely difficult to do.
Large or small, every project or task we complete should be the best your team can provide. All too often, however, team members work in silos and complete these projects without asking for or receiving constructive feedback from their teammates before sending them to the client. This leaves our clients and projects receiving only the best one person can provide rather than the best of an entire team.
We all work with diverse team members who carry varying levels of experience and responsibility. I bet someone on your team will have something to add to your project that you didn’t already think of. Receiving their feedback means every project or task that you complete is the best your team has to offer your clients.
Keep in mind, though, that asking for constructive feedback is more than just having someone conduct a “standard QA.” In my experience, a “standard QA” means someone barely looked over what you sent and gave you the thumbs up. Having someone look over your work and provide feedback is only helpful when done correctly.
Say you’ve just finished writing and adding content to a page and you’ve mustered up the courage to have someone QA your work. Rather than sending it over saying, “Hey, can you review this and make sure I did everything right?”, instead try sending detailed instructions like this:
“Here is a <LINK> to a page I just edited. Can you take 15 minutes to review it? Specifically, can you review the Title Tag and Description? This is something the client said is important to them and I want to make sure I get it right.”
In many cases, you don’t need your manager to organize this for you. You can set this up yourself, and it doesn’t have to be a leadership thing. Before you finish a project or task this week, work with a team member and ask them for help by simply asking them to QA your work. Worried about taking up too much of their time? Offer to swap tasks. Say you’ll QA some of their work if they QA yours.
You will have greater success and consistency if you make QA a mandatory part of your process for larger projects. Any large project, like migrating a site to HTTPS or conducting a full SEO audit, should have a QA process baked into it.
Six months ago I was tasked with presenting one of our 200+ point site audits to a high-profile client. The presentation was already created, with over 100 slides of technical fixes and recommendations. I’m normally pretty comfortable presenting to clients, but I was nervous about presenting such technical details to THIS particular client.
Lucky for me, my team already had a process in place for an in-depth QA of projects like this. My six team members got in a room and I presented to them as if they were the client. Yes, that’s right, I ROLE PLAYED! It was unbearably uncomfortable at first, knowing that each of my team members (who I respect a whole lot) was sitting right in front of me and making note of every little mistake I made.
After an agonizing 60 minutes of presenting to my team, I finished and was ready for the feedback. I just knew the first thing out of their mouths would be something like, “Do you even know what SEO stands for?” But it wasn’t. Because my team had plenty of practice providing feedback like this in the past, they were respectful and, even more so, helpful. They gave me tips on how to better explain canonicalization, helped me alter some visualizations, and gave me positive feedback that ultimately left me confident in presenting to the client later that week.
When teams consistently ask for and receive feedback, they not only improve their quality of work, but they also create a culture where team members aren’t afraid to ask for help. A culture where someone is afraid to ask for help is a toxic one and can erode team spirit. This will ultimately decrease the overall quality of your team’s work. On the other hand, a culture where team members feel safe to ask for help will only increase the quality of service and make for a safe and fun team working experience.
2. Hold a half-day all-hands brainstorm meeting
Building strategies for websites or solving issues can often be the most engaging work that an SEO can do. Yes, that’s right, solving issues is fun, and I am not ashamed to admit it. As fun as it is to do this by yourself, it can be even more rewarding and infinitely more useful when a team does it together.
Twice a year my team holds a half-day strategy brainstorm meeting. Each analyst brings a client or issue they are struggling to resolve: website performance, client communication, strategy development, etc. During the meeting, each team member has one hour or more to talk about their client/issue and solicit help from the team. Together, the team dives deep into client specifics to help answer questions and solve issues.
Getting the most out of this meeting requires a bit of prep from both the manager and the team.
Here is a high-level overview of what I do.
Before the Meeting
Each Analyst is given a Client/Issue Brief to fill out describing the issue in detail. We have Analysts answer the following four questions:
What is the issue you are trying to solve?
What have you already looked into ora tried?
What haven’t you tried that you think might help?
What other context can you provide that will help durante solving this issue?
After all client briefs are filled out, and about 1-2 days prior to the half-day strategy meeting, I share all the completed briefs with the team so they can familiarize themselves with the issues and come prepared to the meeting with ideas.
Day of the Meeting
Each Analyst will have up to an hour to discuss their issue with the team. Afterwards, the team will dive deep into solving it. During the 60-minute span, ideas will be discussed, and Analysts will put on their nerd hats and dive deep into Analytics or code to solve issues. All members of the team are working toward a single goal, and that is to solve the issue.
Once the issue is solved, the Analyst who first outlined it will read back the solutions or ideas for solving it. It may not take the full 60 minutes to get to a solution. Whether it takes the entire time or not, after one issue is solved, another team member announces their issue and the team goes at it again.
Depending on the size of your team, you may need to split up into smaller groups. I recommend 3-5 people per group.
You may be tempted to take longer than an hour, but in my experience, this doesn’t work. The pressure of solving an issue in a limited amount of time can help spark creativity.
This meeting is one of the most effective ways my team practices vulnerability, allowing creativity to flow freely. The structure is such that each team member has a way to provide and receive feedback. My experience has been that each analyst is open to new ideas and earnestly listens to understand the ways they can improve and grow as an analyst. And with this team effort, our clients benefit from the collective knowledge of the team rather than a single individual.
3. Solicit characteristic feedback from your team
This step is not for the faint of heart. If you had a hard time asking someone to QA your work or presenting a site audit in front of your team, then you may find this one the toughest to carry out.
Once a year I hold a special meeting with my team. The purpose of the meeting is to provide a safe place where my employees can share feedback about me with their fellow teammates. In this meeting, the team meets without me and anonymously fills out a worksheet telling me what I should start doing, stop doing, and keep doing.
Why would I subject myself to this, you ask?
How could I not! Being a great SEO is more than just being great at SEO. Wait, what?!? Yes, you read that right. None of us work in silos. We are part of a team, interact with clients, have expectations from bosses, etc. In other words, the work we do isn’t only technical audits or site edits. It also involves how we communicate and interact with those around us.
This special meeting is meant to focus more on our characteristics and behaviors than on our tactics and SEO chops, ensuring that we are well-rounded in our skills and open to all types of feedback to improve ourselves.
How to run a keep/stop/start meeting in 4 steps:
Step 1: Have the team meet together for an hour. After giving initial instructions, you will leave the room so that it is just your directs together for 45 minutes.
Step 2: The team writes down the behaviors they want you to start doing, stop doing, and keep doing. They do this together on a whiteboard or digitally, with one person as a scribe.
Step 3: When identifying the behaviors, the team doesn’t need to be unanimous but they do need to mostly agree. Conversely, the team should not just list them all independently and then paste them together to make a long list.
Step 4: After 45 minutes, you re-enter the room, and over the next 15 minutes the team tells you about what they have discussed.
Here are some helpful tips to keep in mind:
When receiving the feedback from the team, you only have two responses you can give: say “thank you” or ask a clarifying question.
The feedback needs to be about you and not the business.
Do this more than once. The team will get better at giving feedback over time.
Here is an example of what my team wrote during my first time running this exercise.
Let’s break down why this meeting is so important.
With me not in the room, the team can discuss openly without holding back.
Having team members work together and come to a consensus before writing down a piece of feedback ensures the feedback isn’t from a single team member but rather the whole team.
By leaving the team to do it without me, I show that as a manager I trust them and value their feedback.
When I come back to the room, I listen and ask for clarification but don’t argue, which helps set an example of receiving feedback from others.
The best part? I now have feedback that helps me be a better manager. By implementing some of the feedback, I reinforce the idea that I value my team’s feedback and am willing to change and grow.
This isn’t just for managers. Team members can do this themselves. You can ask your manager to go through this exercise with you, and if you are brave enough, you can have your teammates do this for you as well.
4. Hold a team meeting to discuss what you have learned recently
Up to this point, we have primarily focused on how you can ask for feedback to help grow a culture of creativity. In this final section, we’ll focus more on how you can share what you have learned to help maintain a culture of creativity.
Tell me if this sounds familiar: I show up at work, catch up on industry news, review my clients’ performance, plug away at my to-do list, check on the tests I am running and make adjustments, and so on and so forth.
What are we missing in our normal routines? Collaboration. A theme you may have noticed in this post is that we need to work together to produce our best work. What you read in industry news or what you see in client performance should all be shared with team members.
To do this, my team put together a meeting where we can share our findings. Every two weeks, my team meets for an hour and a half to discuss prepared answers to the following four questions.
Question 1: What is something interesting you have read or discovered in the industry?
This could be as simple as sharing a blog post or going more in depth on some research or a test you have done for a client. The purpose is to show that everyone on the team contributes to how we do SEO and helps add knowledge to the team.
Question 2: What are you excited about that you are working on right now?
Who doesn’t love geeking out over a fun site audit, or that content analysis you have been spending weeks building? This is the moment to share what you love about your job.
Question 3: What are you working to resolve?
I know, I know. This is the only section in this meeting that talks about issues you might be struggling to solve. But it is so critical!
Question 4: What have you solved?
Brag, brag, brag! Every analyst has an opportunity to share what they have solved: issues they overcame, how they out-thought Google and beat down the competition.
Creativity is at the heart of what SEOs do. In order to grow in our roles, we need to continue to expand our minds so we can provide stellar performance for our clients. Doing this requires us to receive help from and give help to others. Only then will we thrive in a culture that allows us to be safely vulnerable and actively creative.
I would love to hear how your team creates a culture of creativity. Comment below with your ideas!
The Short Version: Don’t obsess over Domain Authority (DA) for its own sake. Domain Authority shines at comparing your overall authority (your aggregate link equity, for the most part) to other sites and determining where you can compete. Attract real links that drive traffic, and you’ll improve both your Domain Authority and your rankings.
Unless you’ve been living under a rock, over a rock, or really anywhere rock-adjacent, you probably know that Moz has recently invested a lot of time, research, and money in a new-and-improved Domain Authority. People who use Domain Authority (DA) naturally want to improve their score, and this is a question that I admit we’ve avoided at times, because, like any metric, DA can be abused if taken out of context or viewed in isolation.
I set out to write a how-to post, but what follows can only be described as a belligerent FAQ …
Why do you want to increase DA?
This may sound like a strange question coming from an employee of the company that created Domain Authority, but it’s the most important question I can ask you. What’s your end goal? Domain Authority is designed to be an indicator of success (more on that in a moment), but it doesn’t drive success. DA is not used by Google and will have no direct impact on your rankings. Increasing your DA solely to increase your DA is pointless vanity.
So, I don’t want a high DA?
I understand your confusion. If I had to over-simplify Domain Authority, I would say that DA is an indicator of your aggregate link equity. Yes, all else being equal, a high DA is better than a low DA, and it’s ok to strive for a higher DA, but a high DA itself should not be your end goal.
So, DA is useless, then?
No, but like any metric, you can’t use it recklessly or out of context. Our Domain Authority resource page dives into more detail, but the short answer is that DA is very good at helping you understand your relative competitiveness. Smart SEO isn’t about throwing resources at vanity keywords, but about understanding where you realistically have a chance at competing. Knowing that your DA is 48 is useless in a vacuum. Knowing that your DA is 48 and the sites competing on a query you’re targeting have DAs from 30-45 can be extremely useful. Likewise, knowing that your would-be competitors have DAs of 80+ could save you a lot of wasted time and money.
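That kind of comparison is easy to automate. Here's a small sketch with hypothetical DA numbers and an arbitrary "margin" threshold (not an official Moz formula), flagging queries where your DA puts you within striking distance of the current rankers:

```python
# Toy competitiveness check: given my DA and the DAs of sites ranking
# for each query (hypothetical numbers; in practice pulled from a tool),
# flag queries where I can realistically compete.
my_da = 48
serp_das = {  # query -> DAs of the currently ranking sites
    "audience intelligence tool": [30, 38, 41, 45],
    "best credit card": [88, 91, 85, 93],
}

def worth_targeting(my_da, competitor_das, margin=5):
    """Call a query realistic if my DA is within `margin` points of
    the strongest ranking competitor. The margin is an assumption."""
    return my_da + margin >= max(competitor_das)

targets = [q for q, das in serp_das.items() if worth_targeting(my_da, das)]
```

With DA 48, the query whose rankers sit at 30-45 is flagged as winnable, while the 80+ SERP is filtered out, which is exactly the triage the paragraph describes.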
But Google says DA isn’t real!
This topic is a blog post (or eleven) in and of itself, but I’m going to keep it to a couple of points. First, Google’s official statements tend to define terms very narrowly. What Google has said is that they don’t use a domain-level authority metric for rankings. Ok, let’s take that at face value. Do you believe that a new page on a low-authority domain (let’s say DA = 25) has an equal chance of ranking as one on a high-authority domain (DA = 75)? Of course not, because every domain benefits from its aggregate internal link equity, which is driven by the links to individual pages. Whether you measure that aggregate effect in a single metric or not, it still exists.
Let me ask another question. How do you measure the competitiveness of a new page that has no Page Authority (or PageRank or whatever metrics Google uses)? This question is a big part of why Domain Authority exists — to help you understand your ability to compete for terms you haven’t targeted and for content you haven’t even written yet.
Seriously, give me some tips!
I’ll assume you’ve read all of my warnings and taken them seriously. You want to improve your Domain Authority because it’s the best authority metric you have, and authority is generally a good thing. There are no magical secrets to improving the factors that drive DA, but here are the main points:
1. Get more high-authority links
Shocking, I know, but that’s the long and short of it. Links from high-authority sites and pages still carry significant ranking power, and they drive both Domain Authority and Page Authority. Even if you choose to ignore DA, you know high-authority links are a good thing to have. Getting them is the topic of thousands of posts and more than a couple of full-length novels (well, ok, books — but there’s probably a novel and feature film in the works).
2. Get fewer spammy links
Our new DA score does a much better job of discounting bad links, as Google clearly tries to do. Note that “bad” doesn’t mean low-authority links. It’s perfectly natural to have some links from low-authority domains and pages, and in many cases it’s both relevant and useful to searchers. Moz’s Spam Score is pretty complex, but as humans we intuitively know when we’re chasing low-quality, low-relevance links. Stop doing that.
3. Get more traffic-driving links
Our new DA score also factors in whether links come from legitimate sites with real traffic, because that’s a strong signal of usefulness. Whether or not you use DA regularly, you know that attracting links that drive traffic is a good thing that indicates relevance to searches and drives bottom-line results. It’s also a good reason to stop chasing every link you can at all costs. What’s the point of a link that no one will see, that drives no traffic, and that is likely discounted by both our authority metrics and Google?
You can’t fake real authority
Like any metric based on signals outside of our control, it’s theoretically possible to manipulate Domain Authority. The question is: why? If you’re using DA to sell DA 10 links for $1, DA 20 links for $2, and DA 30 links for $3, please, for the love of all that is holy, stop (and yes, I’ve seen that almost verbatim in multiple email pitches). If you’re buying those links, please spend that money on something more useful, like sandwiches.
Do the work and build the kind of real authority that moves the needle both for Moz metrics and Google. It’s harder in the short-term, but the dividends will pay off for years. Use Domain Authority to understand where you can compete today, cost-effectively, and maximize your investments. Don’t let it become just another vanity metric.
The jack-of-all-trades SEO would choose and install the CMS and plugins for clients’ sites and blogs, build the category and tag structure, write the content, and even go out hunting for links. They would send the periodic reports (and collect the invoices at the end of each period).
They also knew how to register the domain and set up the hosting; they had an eye for graphics and could even design your logo for you. They were even a bit of a “social media guy” and dabbled in pay-per-click too!
Over the years, as with every profession that goes through an evolution, the jack-of-all-trades SEO has split into 3 different figures:
The technical SEO.
The SEO copywriter.
The link builder.
The first figure comes from the world of programmers and developers. Usually a computer scientist, someone with a square, analytical mind. They work on the “structure” of the site.
The second has a humanities background instead: they might be a columnist or journalist coming from print (the older type) or a blogger (the more recent one). They are more creative. They work on-page.
The third figure, finally, is a PR person or a good negotiator, in the sense that they are skilled at managing relationships with many webmasters and at negotiating the buying and selling of links across various sites and publications. They work off-site, outside the website.
An SEO who knows how to move expertly across all 3 areas is extremely rare; there’s no point denying it.
But the real problem is that each of the 3 SEOs, having grown up only inside their own niche, will claim that the “true SEO”, the kind that works, is the kind they practice.
The technician will say that a fast site, with a good structure and a correct Schema.org implementation, is what it takes these days to rank on Google.
The copywriter will say instead that content is by far the most important thing: “content is king,” and all that.
The link builder will say that without links you can’t do anything, and that the world wide web, after all, is based precisely on the fact that sites and pages link to one another.
The truth is that all of these things matter, and a good SEO should be the orchestra conductor capable of directing these 3 musicians. Or, if you prefer another metaphor, the chef who knows how to combine and balance the 3 ingredients skillfully to produce an excellent dish.
But the “single-subject” SEO will never admit it, and will always insist that the only SEO that works is their own. Because they are unable to do what the other 2 do, and rather than work as a team they distort reality to their own advantage, badmouth every other SEO, and lead the unwary client astray.
In simple terms, a 301 redirect tells the browser: “This page has moved permanently. This is the new location and we don’t intend on moving it back.” To which the browser responds: “Sure thing! I’ll send the user there right now!”
That’s why if you try to visit blog.ahrefs.com, it won’t happen.
You’ll end up at ahrefs.com/blog instead.
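To see that exchange concretely, here is a minimal, self-contained sketch using only Python’s standard library: a throwaway local server answers with a 301 and urllib follows it, just as a browser would (the paths and body text are invented for the demo):

```python
# Minimal local demo of a 301: a throwaway server answers /old-page with
# "301 Moved Permanently" plus a Location header, and urllib follows it.
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/old-page":
            self.send_response(301)                # "moved permanently"
            self.send_header("Location", "/new-page")
            self.end_headers()
        else:
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"You ended up on the new page")

    def log_message(self, *args):                  # keep the demo quiet
        pass

server = HTTPServer(("127.0.0.1", 0), RedirectHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

with urllib.request.urlopen(f"http://127.0.0.1:{server.server_port}/old-page") as resp:
    final_url = resp.geturl()   # urllib followed the 301 automatically
    body = resp.read().decode()

server.shutdown()
print(final_url, body)
```

The client never has to be told twice: it requests /old-page, sees the 301, and lands on /new-page on its own.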
How to do a 301 redirect
There are many ways to do 301 redirects, but the most common method is to edit your site’s .htaccess file.
You’ll find this in your site’s root folder:
Don’t see the file? That means one of two things:
You don’t have a .htaccess file. Create one using Notepad (Windows) or TextEdit (Mac). Just create a new document and save it as .htaccess. Make sure to remove the standard .txt file extension.
Your site isn’t running an Apache web server. This is somewhat technical, but there are different types of web servers. Apache, Windows/IIS, and Nginx are the most common. Only Apache servers use .htaccess. To check that your website runs Apache, use this tool. Check that the “Web server” shows as “Apache” under “Hosting history.”
Here are some snippets of code for adding common types of 301 redirect via .htaccess:
IMPORTANT. These instructions are for Apache web servers only. Read this if your site runs Nginx, this if your site runs Windows/IIS.
Redirect an old page to a new page
Redirect 301 /old-page.html /new-page.html
Using WordPress? Remove the need to edit the .htaccess file with the free Redirection plugin.
There are quite a few ways to do this. I am by no means an expert when it comes to Apache servers and htaccess files. This is the code that has always worked for me. Make sure to test this before implementing it on your site.
IMPORTANT! If RewriteEngine is already in your .htaccess file, do not repeat it. Just copy the rest of the code. It’s also possible to do this in cPanel, which may be preferable.
Redirect entire domain from ‐www to www (and vice‐versa)
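A commonly used mod_rewrite sketch for this (example.com is a placeholder; swap in your own domain, and as with the other snippets, test before deploying):

```apache
RewriteEngine On

# non-www to www:
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [L,R=301]

# ...or, vice-versa, www to non-www (use one block, not both):
# RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
# RewriteRule ^(.*)$ https://example.com/$1 [L,R=301]
```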
IMPORTANT! The placement and order of code in your htaccess file matters too. You may experience unwanted effects if multiple instructions are placed in the “wrong” order (e.g., redirect chains, etc.). If you’re planning to implement a lot of 301 redirects in your htaccess file, this is something worth looking into.
Most SEO professionals focus on the relationship between 301 redirects and PageRank.
Not familiar with PageRank? It’s the formula Google created to judge the “value of a page” based on the quantity and quality of its links. Of course, PageRank is far from the only “ranking factor,” but it’s generally believed that, on the whole, higher PageRank equates to higher rankings.
Is there evidence for that? Yes, Google (re)confirmed PageRank as a ranking signal last year:
DYK that after 18 years we’re still using PageRank (and 100s of other signals) in ranking?
There’s also a clear positive correlation between Ahrefs’ URL Rating—which works in a similar way to PageRank—and the amount of organic traffic a page gets:
The reason why I’m talking about URL Rating (UR) and not PageRank is that Google discontinued public PageRank scores in 2016. Now there’s no way of knowing how much PageRank a page has. I’m not saying that UR is a PageRank equivalent by any stretch, but it’s the closest comparable metric we have.
So how does this relate to 301 redirects?
Before 2016, if you used a 301 redirect to redirect one page to another, there was some loss of PageRank along the way. How much? That’s debatable, but 15% seemed to be the general assumption. It’s also the range Matt Cutts, Google’s former Head of Webspam, alluded to in this 2013 video:
Matt didn’t actually say that 301 redirects lost 15% of PageRank in that video. That was just the figure he used as an example. However, it’s the number that most SEO professionals seemed to run with for quite a few years. That’s likely because 15% also relates to the “damping factor” in the original PageRank patent.
For argument’s sake, let’s assume that the number was 15%.
Here’s how that would play out:
Simple 301 redirect: domain.com/page-1 → domain.com/page-2 = 15% loss of PageRank
301 redirect chain: domain.com/page-1 → domain.com/page-2→ domain.com/page-3 → domain.com/page-4 = 38% loss of PageRank!
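The arithmetic behind those two numbers is simple compounding; here is a quick sketch, assuming the 15%-per-hop figure discussed above:

```python
# If each 301 hop loses 15% of PageRank (the commonly assumed pre-2016
# figure), losses compound multiplicatively across a chain of hops.
def pagerank_retained(hops, loss_per_hop=0.15):
    """Fraction of PageRank surviving a chain of `hops` 301 redirects."""
    return (1 - loss_per_hop) ** hops

single = 1 - pagerank_retained(1)  # page-1 -> page-2
chain = 1 - pagerank_retained(3)   # page-1 -> page-2 -> page-3 -> page-4

print(f"1 hop:  {single:.0%} of PageRank lost")
print(f"3 hops: {chain:.1%} of PageRank lost")  # ~38.6%, i.e. the ~38% above
```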
But in 2016, Google stated that 3XX redirects no longer lose PageRank. So, in 2019, if you redirect domain.com/page1 to domain.com/page2, the redirected page should have just as much “power” as the original page.
That’s a BIG deal, and it’s part of the reason 301 redirects can be so useful for boosting organic traffic. (More on that later!)
But 301 redirects can cause plenty of other SEO‐related issues that don’t often get talked about.
How to fix existing 301 redirect issues on your site
Here’s how to find and fix issues related to 301 redirects.
1. Make sure the HTTP version of your site redirects to HTTPS
Every website should use HTTPS.
Not only does it add an extra layer of security for your visitors, but Google uses HTTPS as a ranking signal. Combine that with the fact that SSL certificates are available for free via Let’s Encrypt, and there really is no excuse not to use HTTPS in 2019.
But having an SSL certificate is only half the battle…
You also need to make sure that people actually visit the HTTPS version of your site, which means using a 301 redirect between the HTTP and HTTPS version.
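On Apache, one common way to put that redirect in place is a mod_rewrite rule in .htaccess. A sketch (same caveats as the earlier snippets; test before deploying):

```apache
RewriteEngine On
# Any request that did not arrive over HTTPS gets 301'd to the HTTPS version
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [L,R=301]
```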
To check that this redirect is in place, go to your homepage and look at the URL bar. You should see https://[www].yourwebsite.com/, plus a lock icon.
Change this to http:// (not https://), then hit enter. You should be redirected to the HTTPS version automatically.
If this happens, then things should be good for the most part. But there can still be issues, like:
HTTP to HTTPS redirect isn’t implemented across all pages on your site (e.g., subdomains).
Head to the Internal pages report and look for these issues:
NOTE. If you see one page with an HTTP to HTTPS warning, and it’s merely the HTTP version of the page from which the crawl began, then this isn’t an issue.
Fix these issues by applying the proper 301 redirects from the HTTP to HTTPS version(s) of the affected page(s).
2. Remove pages with 301 status codes from your sitemap
Google looks to sitemaps to understand which pages to crawl and index.
Because pages with 301 status codes no longer technically exist, there’s no point asking Google to crawl them. If such pages remain in your sitemap, Google may continue to revisit them each time they re‐crawl your website. That’s unnecessary and wastes crawl budget.
Here’s one way to find such pages:
Find your sitemap URL (this is usually yourdomain.com/sitemap.xml… but not always)
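If you prefer to script this, step one is pulling every URL out of the sitemap so you can run them through a status checker afterwards. A rough standard-library sketch (the sample XML is invented):

```python
# Extract every <loc> URL from a sitemap (standard sitemaps.org namespace),
# ready to be fed into an HTTP status code checker.
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text):
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(f"{SITEMAP_NS}loc")]

sample = """<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/old-page</loc></url>
  <url><loc>https://example.com/new-page</loc></url>
</urlset>"""

print(sitemap_urls(sample))
# ['https://example.com/old-page', 'https://example.com/new-page']
```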
3. Fix redirect chains
While Googlebot and browsers can follow a “chain” of multiple redirects (e.g., Page 1 > Page 2 > Page 3), we advise redirecting to the final destination. If this is not possible, keep the number of redirects in the chain low, ideally no more than 3 and fewer than 5.
Redirect chains serve no other purpose than to damage user experience and slow things down, so you should avoid them where possible.
To check more than 100 pages in one go, check the Internal pages report in Ahrefs’ Site Audit for “Redirect chain” errors.
Clicking this will reveal all the URLs in the chain, including the final destination page.
There are two ways to fix these errors:
Replace the redirect chain with a single 301 redirect. Instead of Page 1 > Page 2 > Page 3 > Page 4, the redirect becomes Page 1 > Page 4.
Replace internal links to redirected pages with direct links to the final URL. This prevents Google and other bots from crawling the redirect chains. More importantly, it prevents actual humans (you know, the type who *might* buy something from your website) from having to deal with the slowness of multiple redirects when they click a link.
Where practical, the second solution is the best option.
To do that, sort the list of redirect chains by the “No. of inlinks” column from high to low. Then click the number of inlinks to see all internal links to the redirected page.
Replace the internal links on the affected pages with direct links to the final destination URL.
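The first fix (collapsing each chain to a single hop) is easy to script once you have a map of which URL redirects where. A hypothetical sketch, with loop detection thrown in since loops are the next issue on the list:

```python
# Given a map of "URL -> where its 301 points", collapse every chain so
# each old URL redirects straight to its final destination
# (Page 1 -> Page 4 instead of Page 1 -> Page 2 -> Page 3 -> Page 4).
def flatten_redirects(redirects):
    flat = {}
    for start in redirects:
        seen, url = set(), start
        while url in redirects:
            if url in seen:      # redirect loop: flag it instead of hanging
                flat[start] = None
                break
            seen.add(url)
            url = redirects[url]
        else:
            flat[start] = url    # reached a URL that doesn't redirect
    return flat

chain = {"/page-1": "/page-2", "/page-2": "/page-3", "/page-3": "/page-4"}
print(flatten_redirects(chain))
# {'/page-1': '/page-4', '/page-2': '/page-4', '/page-3': '/page-4'}
```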
4. Fix redirect loops
Redirect loops occur when a URL redirects back to one of the other URLs in the chain. This creates an infinite loop of redirects that can confuse and trap both search engines and users alike.
These are user‐experience killers because they usually result in a response like this from the browser:
You can find redirect loop errors in batches of 100 using that same HTTP status code checker we used before. Look for “Exceeded maximum number of redirects” errors.
For more than 100 pages, check the Internal pages report in Ahrefs’ Site Audit for “Redirect loop” errors.
Click this to reveal all pages with redirect loop issues, then fix each issue in one of two ways:
If the URL is not supposed to redirect, change its HTTP response code to 200.
If the URL is supposed to redirect, fix the final destination URL and remove the loop. Alternatively, remove or replace all inlinks to the redirecting URL.
5. Fix broken redirects
Broken redirects are pages that redirect to a dead page (i.e., one that returns a 4XX or 5XX HTTP response code).
Example: Page 1 (301) > Page 2 (404)
These are bad because neither visitors nor search engine bots can access the final URLs. Because of that, most visitors will leave your website, and most search engines will abandon the crawl.
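In code terms, the rule is simply “a redirect whose chain ends on a 4XX or 5XX is broken.” A tiny sketch of that check (status lists are invented):

```python
# A redirect is "broken" if the chain of responses ends on a 4XX/5XX code.
def is_broken_redirect(chain_statuses):
    """chain_statuses: HTTP codes seen while following a redirect,
    e.g. [301, 404] for Page 1 (301) > Page 2 (404)."""
    return bool(chain_statuses) and chain_statuses[-1] >= 400

print(is_broken_redirect([301, 404]))  # True  -> broken
print(is_broken_redirect([301, 200]))  # False -> healthy
```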
You can check for these errors in batches of 100 using an HTTP status code checker.
To check more pages, look for “Broken redirect” errors in the Internal pages report in Ahrefs’ Site Audit.
Fix these errors by either:
Reinstating the dead page (if deleted accidentally)
Removing the inlinks to the redirected URL.
6. Redirect 404 pages
Pages that return a 404 status are dead, and so the browser returns a page like this:
Now, there are times when a user seeing this page makes sense. If someone types the wrong URL into their browser, for example, then the error page lets them know that something is wrong. You can see an example of that above—it makes total sense to return a 404 page for this URL.
Having said that, pages with 404 status codes are a problem when:
They’re crawlable. Crawlable usually equates to clickable. And if they’re clickable, some users are going to end up clicking internal links on your site only to see a dead page. That’s not great for user experience.
They have backlinks. Because 404 pages aren’t accessible, any backlinks that point to them are effectively wasted.
To tackle that first issue, check the Internal links report in Ahrefs’ Site Audit for “404 page” errors.
Click this to see all 404 pages that were found during the crawl.
Next, hit the “Manage columns” button, add the “No. of dofollow backlinks” column, hit “Apply,” then sort by this column from high to low.
Check the Backlinks report in Ahrefs’ Site Explorer for any pages with one or more “dofollow” backlinks. There’s a chance these links may be valuable. If they are, you’ll want to redirect (301) that page to another relevant resource on your website.
Redirecting 404 pages to somewhere relevant is key. Google treats irrelevant 301 redirects as soft 404’s, so there’s no real advantage to redirecting unless you’re doing so to a similar and relevant page.
Google’s John Mueller explains more in this video.
If you don’t have a similar or relevant page, and you still have a 404 page with lots of high‐quality backlinks, then, honestly, it may be worth republishing the content that used to exist at that location.
Think of it like this:
If the dead page was valuable enough to attract high‐quality backlinks in the first place, then it’s worth questioning why it no longer exists. I mean, it’s clearly a topic people are interested in.
For pages without dofollow backlinks, fix them by either:
Reinstating the dead page at the given URL
Redirecting (301) the dead page to another relevant page
Removing or replacing all internal links to the dead page
IMPORTANT. If you opt for #3, make sure that you not only replace the internal links but also the anchor text and surrounding text where necessary.
7. Replace 302 redirects and meta refresh redirects with 301s
Never use 302 redirects or meta refresh redirects for permanent redirects.
302 redirects are for temporary moves, and Google recommends not using meta refresh redirects at all if possible. So, if you have either of these on your site, you should aim to either remove them or replace them with 301 redirects.
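For reference, a meta refresh redirect lives in the page’s HTML rather than in the server config. It looks something like this (placeholder URL), and it’s the pattern to replace with a server-side 301 when the move is permanent:

```html
<!-- Waits 0 seconds, then sends the browser to the new URL -->
<meta http-equiv="refresh" content="0; url=https://example.com/new-page/">
```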
To see pages with these HTTP status codes, check the Internal pages report in Ahrefs’ Site Audit for “Meta refresh redirect” and “302 redirect” issues.
Luckily, both these issues can be fixed in the same way:
If the redirect is permanent, use a 301 instead.
If the redirect isn’t permanent, remove the redirect.
You should also aim to remove or replace internal links to redirected pages, especially if they’re likely to confuse users who click them.
8. Check for redirected (301) pages that get organic traffic
Pages with HTTP 301 status codes shouldn’t get organic traffic because they shouldn’t be in Google’s index. If such pages are getting traffic, it means that Google hasn’t yet seen the redirect.
To check for 3XX pages with traffic, check the Overview report in Ahrefs’ Site Audit for “3XX page receives organic traffic” errors.
If you got your list of 3XX pages from elsewhere (e.g., an HTTP status code checker), then paste them into Ahrefs’ Batch Analysis tool in batches of up to 200 to see page‐level organic traffic.
NOTE. You could also check organic traffic in Google Analytics or Google Search Console.
Now, if you only recently added the 301 redirect, this likely isn’t much of an issue. Google should see it during their next crawl, after which they should deindex the page.
To speed up that process, paste the URL into the URL Inspection tool in Google Search Console, then hit “Request indexing.”
You should also remove these pages from your sitemap (see #2) and re‐submit it via Google Search Console.
9. Check for “bad” external 301s
Most websites link out to relevant third‐party sites and resources.
That’s fine… until the page to which you link externally gets redirected elsewhere.
For example, imagine that you link out to a useful resource. Twelve months later, that domain expires and gets picked up by an expired domain hunter who deletes the resource and redirects to their “money” site. Now you’regnante unintentionally linking to something irrelevant (and potentially even harmful) to your visitors.
For this reason, it’s important to check for “bad” external 301’s from time to time.
To do this, head to the External pages report in Ahrefs’ Site Explorer and look for “External 3XX redirect” warnings.
Click this to see a list of all the redirected external links, plus the final destination URL.
Seeing a lot of pages?
Because nofollowed external links are often things like blog comments, you can remove these to give a cleaner list. Just add a “No. of dofollow inlinks > 0” filter to the report.
This should help to prioritize things.
Next, skim the report, looking at the URL and Redirect URL columns. Look for redirects that don’t seem right. In other words, ignore things like HTTP to HTTPS redirects, and blog.domain.com/page to domain.com/blog/page redirects. Look for redirects to different sites or pages.
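If you have the redirect pairs exported, a quick script can do the first pass of that skim. This is a naive sketch (the “last two labels” trick misreads co.uk-style domains, and all URLs below are invented), so treat the output as candidates for review, not verdicts:

```python
# Rough triage for "external 3XX" warnings: flag outbound links whose
# redirect ends up on a different root domain than the page you linked to.
from urllib.parse import urlsplit

def root_domain(url):
    # naive "last two labels" heuristic
    return ".".join(urlsplit(url).hostname.split(".")[-2:])

def suspicious(redirect_pairs):
    """Keep only (link_target, final_url) pairs that cross domains."""
    return [(src, dest) for src, dest in redirect_pairs
            if root_domain(src) != root_domain(dest)]

pairs = [
    ("http://example.com/page", "https://example.com/page"),            # HTTP->HTTPS: fine
    ("https://blog.example.com/post", "https://example.com/blog/post"), # subdomain move: fine
    ("https://old-resource.com/guide", "https://money-site.net/"),      # review this one
]
print(suspicious(pairs))
# [('https://old-resource.com/guide', 'https://money-site.net/')]
```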
Here’s an example I found when crawling Backlinko:
The issue here isn’t so much that the redirect points to another website. Those who are familiar with Neil Patel will know that he merged blog.kissmetrics.com with neilpatel.com earlier this year.
Rather, the issue is that the redirected page is a completely different article.
Original article title: Using the Magic of Qualitative Data to Increase SaaS Conversions
Redirected article title: How Understanding Your Customer Will Help You Create Copy That Sells
In these cases, it’s best to remove the internal link(s) to the redirected page.
To do this, just hit the number in the “No. of inlinks” column to see every page with internal links to the redirected page.
Go into your CMS and remove them.
How to use 301 redirects to boost your organic traffic
By this stage, your website should be free of any SEO‐hindering issues related to 301 redirects.
Now it’s time to get serious and talk about how we can use the power of redirects to massively boost organic traffic.
Here are two methods for doing that.
The Cocktail Technique
You have a glass of Coke. Mmm. You have a glass of rum. Tasty!
Both of those are great drinks in their own right. Combine them, however, and you take things to another level. Hello, Cuba Libre!
So how does this relate to 301 redirects?
Think of both these drinks as topically‐related pages on your website. They’re each performing OK. They have a few decent backlinks. They get some organic traffic. Not too bad at all. But why not merge and consolidate those two pages into one to make something even better?
In doing so, chances are that we could transform two average‐performing pages into one delicious cocktail of a page that performs way better!
We recently did this with two of our posts on the Ahrefs blog:
Both these articles were getting old, so we decided to merge them into one new guide.
We then republished at ahrefs.com/blog/skyscraper-technique/ and redirected the other article to that.
The results speak for themselves:
So why does this work?
Consolidation of “authority”: Remember how 301 redirects no longer “leak” PageRank? By redirecting one of these articles to the other, we were able to merge the “authority” of both pages into one. Of course, this doesn’t work if the pages are unrelated, because Google treats such redirects as soft 404’s. But because these two pages are similar, this worked a treat.
Better content: Both of the articles we had were of decent quality. They were just starting to get a little outdated. By taking the best of both posts and merging them, we created a substantially better piece of content that, in our eyes, deserves more traffic.
Now, the only question that remains is how to replicate this strategy, right?
Here’s the process.
Step 1. Check for keyword cannibalization issues (with backlinks)
Keyword cannibalization is when two or more pages target and rank for the same keyword(s). Finding such issues is a good way to identify merge opportunities.
Two things stand out about the current top‐ranking page:
It gets almost 2x the traffic of the two posts from Hubspot combined!
It has links from just 192 referring domains… less than half of the 467 referring domains pointing to Hubspot’s two posts
So if Hubspot were to merge these two posts into one, and consolidate all that delicious “link juice,” then I’d say they’d have a good chance at claiming the number one spot. This could potentially 2x their traffic!
Step 3. Rewrite and merge the pages
Now it’s time to take the best things about each page and combine them into one.
For example, if we were doing this for the aforementioned Hubspot articles, we’d probably keep the section about “How to Run Your Own User Generated Content Campaign” from one post:
… and keep the part explaining “Why User‐Generated Content?” from the other:
To keep the relevance of the new page as high as possible, and mitigate the risk that Google will treat our 301 as a soft 404, we could also check the Anchors report in Site Explorer for each page:
This gives some insight into why people linked to the pages in the first place.
For example, I can see that a fair few people are quoting statistics when linking to this page, so it may be worth keeping those stats in our revamped post.
You should also take the rewriting/merging of two pages as an opportunity to better serve search intent and give searchers what they’re looking for. If there are a lot of top 10 lists ranking for the target keyword, make your new revamped post a top 10 list. If there are a lot of how‐to guides, well… you get the idea!
NOTE. That has nothing to do with 301 redirects, but it’s worth doing if you want to maximize the ROI of your efforts.
Step 4. Publish your revamped page and implement the 301 redirect(s)
Now it’s finally time to publish your revamped post/page.
If either of the old URLs is a good match for your new post, then feel free to republish at the same URL. You can then delete the other post/page and add a 301 redirect to the new post.
You may recall that’s what we did with our skyscraper technique post. We reused the /skyscraper‐technique/ URL.
If neither of the old URLs is a good match for your new post/page, then it’s also perfectly fine to 301 redirect both pages to a totally new URL.
For example, if we were to merge those two Hubspot posts into this guide:
… then neither of the two old URLs would really fit the bill.
It would be better to publish at something like blog.hubspot.com/marketing/user-generated-content/
So, we could do that, then 301 redirect the other two pages to that URL. Simple.
Looking to take this idea even further? Do a content audit to find pages with no organic traffic or rankings that still have backlinks.
If these pages aren’t important to your business, delete and redirect them to a relevant page that does matter.
Here’s what happened to one site’s organic traffic after using this method:
The results of using the method.
That’s a ~116% traffic increase in 12 months!
Here’s the process in a nutshell:
Buy another business or website in your industry.
Merge their site with yours using 301 redirects.
Backlinko’s Brian Dean did this last year. He bought another SEO blog—Point Blank SEO—and redirected it to Backlinko. In fact, it was he who used this method to achieve the results you see in the screenshot above.
But before you start buying every website you can get your hands on, understand this:
Having success with this method isn’t as simple as just buying any old website and using 301s to redirect all pages to your homepage. That’s the lazy approach, and in 2019, it’s not a good idea. You need to implement 301 redirects on a page‐by‐page basis.
Here’s how to do it, step‐by‐step:
1. Re‐home and redirect content
The biggest traffic gains are likely to come from re‐homing and redirecting content.
Brian Dean did this with some of the posts on pointblankseo.com, including Jon’s infamous list of link building strategies.
You know, the one with this backlink profile:
This original URL was: pointblankseo.com/link-building-strategies
The new (redirected) URL is: backlinko.com/link-building-strategies
Because Brian moved the post from the old domain to the new one with a 301 redirect, all of those links now effectively point to that same page on backlinko.com instead. The page has effectively just moved to a new home.
The re‐homing and redirecting of content is the best option when all of these apply:
The content has organic traffic
The topic is relevant to your business
The content is high‐quality
Note that you can combat that final point by updating or rewriting the content after moving and redirecting it. Brian did this with that list of link building strategies, which hadn’t been updated since around 2012.
The old post on pointblankseo.com
The new post on backlinko.com
2. Delete and redirect to a different page
There’s no point keeping or re‐homing pages that:
Have little or no organic traffic potential.
Are duplicates of topics you’ve already covered
For example, there’s no point keeping the about us page from the website you’re merging, because then you’ll have two about us pages… which makes no sense. This is also true of other pages which target the same keywords as existing pages on your website.
Similarly, if pages have little or no traffic potential, then you may as well get rid of them and redirect them elsewhere. This is what Brian did with quite a few posts on pointblankseo.com, such as this post about outreach platforms:
This original URL was: pointblankseo.com/outreach-platforms
The new (redirected) URL is: backlinko.com/link-building-tools
He did this because the “outreach platforms” topic has no search volume and no traffic potential. It’s not a topic worth targeting.
So it made more sense to redirect this post to another relevant post with traffic potential.
3. Delete and redirect to your homepage
If there’s nowhere relevant to redirect pages, and it doesn’t make sense to move and re‐home them, then the last resort is to redirect them to your homepage.
Brian did this with most of the pages on pointblankseo.com, such as this ego‐bait guide:
This original URL was: pointblankseo.com/egobait-guide
The new (redirected) URL is: backlinko.com/blog
Why is this a last resort? Well, remember what we covered earlier about Google treating irrelevant 301 redirects as soft 404’s. This may happen when redirecting posts and pages to your homepage.
But here’s the thing: if you don’t redirect these pages, then there’s a 100% chance of Google treating them as soft 404’s. Conclusion: you may as well redirect them.
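Assuming the acquired domain also runs Apache, the three options above can be sketched in a single .htaccess on the old domain. The slugs echo the Point Blank SEO examples, but the domains are placeholders; order matters, so the homepage catch-all goes last:

```apache
RewriteEngine On

# 1. Re-homed content keeps its slug on the new domain
RewriteRule ^link-building-strategies/?$ https://yourdomain.com/link-building-strategies [L,R=301]

# 2. Deleted page redirected to the closest relevant page
RewriteRule ^outreach-platforms/?$ https://yourdomain.com/link-building-tools [L,R=301]

# 3. Last resort: everything else goes to the homepage
RewriteRule ^ https://yourdomain.com/ [L,R=301]
```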
There’s one caveat to this, however, which is that you shouldn’t redirect pages with low‐quality backlinks. Doing so is likely to cause more harm than good, so make sure to check the Backlinks report in Site Explorer for each page before redirecting.
If the backlink profile looks like this…
… then it’s probably best to just delete that page and leave it as a 404.
Or, if you really feel the need to redirect the page, then you could disavow the bad links before doing so. However, this is likely more effort than it’s worth.
301 redirects have a lot of uses when it comes to SEO.
Use them strategically and you could see huge gains in organic traffic. However, it pays to make sure there are no existing problems with 301 redirects on your website first, as these could be hindering your current and future SEO efforts.
Did I miss anything in this guide? Let me know in the comments or via Twitter.
At Wall Street Oasis, we’ve noticed that every time we improve our page speed, Google sends us more organic traffic. In 2018, our company’s website received over 80 percent of its traffic from organic search. That’s 24.5 million visits. Needless to say, we are very tuned in to how we can continue to improve our user experience and keep Google happy.
We thought this article would be a great way to highlight the specific steps we take to keep our page speed lightning fast and organic traffic healthy. While this article is somewhat technical (page speed is an important and complex subject) we hope it provides website owners and developers with a framework for how to try and improve their page speed.
Quick technical background: Our website is built on top of the Drupal CMS and we are running on a server with a LAMP stack (plus Varnish and memcache). If you are not using MySQL, however, the steps and principles in this article are still relevant for other databases or a reverse proxy.
Ready? Let’s dig per.
5 Steps to speed up the backend
Before we jump into specific steps that can help you speed up your backend, it might help to review what we mean by “backend”. You can think of the backend as everything that goes into storing data, including the database itself and the servers — basically anything that helps make the website function that you don’t visually interact with. For more information on the difference between the backend vs. the frontend, you can read this article.
Step 1: Make sure you have a Reverse Proxy configured
This is an important first step. For Wall Street Oasis (WSO), we use a reverse proxy called Varnish. It is by far the most critical and fastest layer of cache and serves the majority of the anonymous traffic (visitors logged out). Varnish caches the whole page in memory, so returning it to the visitor is lightning fast.
Step 2: Extend the TTL of that cache
If you have a large database of content (specifically in the 10,000+ URL range) that doesn’t change very frequently, you can extend the time to live (TTL basically means how long before you flush the object out of the cache) to drive the hit rate higher on the Varnish caching layer.
For WSO, we went all the way up to two weeks (since we had over 300,000 discussions). At any given time, only a few thousand of those forum URLs are active, so it makes sense to heavily cache the other pages. The downside to this is that when you make any sitewide, template or design changes, you have to wait up to two weeks for them to roll out across all URLs.
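As a rough illustration of what the two-week TTL looks like, here is a minimal sketch in Varnish (4+) VCL syntax. This is an assumption about how such a rule might be written, not WSO's actual config, and real configs usually vary the TTL by content type and route:

```vcl
# Sketch: force a two-week TTL on cacheable backend responses.
# Real-world configs would vary this per content type / URL pattern.
sub vcl_backend_response {
    if (beresp.status == 200) {
        set beresp.ttl = 2w;
    }
}
```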
Step 3: Warm up the cache
In order to keep our cache “warm,” we have a specific process that hits all the URLs in our sitemap. This increases the likelihood of a page being in the cache when a user or Google bot visits those same pages (i.e., our hit rate improves). It also keeps Varnish full of more objects, ready to be accessed quickly.
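A cache-warming pass like this can be sketched with nothing more than the standard library: fetch the sitemap, pull out every URL, and request each one so the proxy caches a fresh copy. This is a minimal sketch assuming a standard XML sitemap; `parse_sitemap_urls` and `warm_cache` are hypothetical helpers, not WSO's actual script.

```python
# Minimal cache-warmer sketch: parse a standard XML sitemap and request
# every URL so the reverse proxy stores a fresh copy of each page.
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def parse_sitemap_urls(sitemap_xml: str) -> list:
    """Extract all <loc> URLs from a sitemap document."""
    root = ET.fromstring(sitemap_xml)
    return [loc.text for loc in root.iter(SITEMAP_NS + "loc")]

def warm_cache(sitemap_url: str) -> None:
    """Request every URL in the sitemap; response bodies are discarded."""
    with urllib.request.urlopen(sitemap_url) as resp:
        urls = parse_sitemap_urls(resp.read().decode())
    for url in urls:
        urllib.request.urlopen(url).read()
```

In production you would run something like this on a schedule (cron) and throttle the requests so the warmer doesn't compete with real visitors.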
As you can see from the chart below, the ratio of “cache hits” to total hits is over 93 percent.
Step 4: Tune your database and the slowest queries
At WSO, we use a MySQL database. Make sure you enable the slow queries report and check it at least every quarter. Check the slowest queries using EXPLAIN. Add indexes where needed and rewrite queries that can be optimized.
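For reference, here is what enabling the slow query log and inspecting a query looks like in MySQL. The one-second threshold and the `discussions` table/`topic_id` column are illustrative placeholders, not WSO's actual schema or settings:

```sql
-- Enable the slow query log at runtime and log anything slower than
-- one second (threshold is illustrative; tune it for your workload).
SET GLOBAL slow_query_log = 'ON';
SET GLOBAL long_query_time = 1;

-- Then inspect a slow query's execution plan and add indexes as needed.
-- Table and column names here are hypothetical.
EXPLAIN SELECT * FROM discussions WHERE topic_id = 42;
```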
Be sure you’re using the correct format. If it is a script: <url>; rel=preload; as=script
If it is a CSS file: <url>; rel=preload; as=style
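The two header formats above differ only in the `as` value, so a tiny helper makes the convention hard to get wrong. This is a hypothetical helper for illustration; the header values themselves follow the standard HTTP `Link` preload convention:

```python
# Build an HTTP "Link" preload header value for a script or CSS resource.
# Helper name and signature are illustrative, not from any framework.
def preload_header(url: str, resource_type: str) -> str:
    """Return '<url>; rel=preload; as=...' for 'script' or 'css'."""
    as_value = {"script": "script", "css": "style"}[resource_type]
    return f"<{url}>; rel=preload; as={as_value}"
```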
7 Steps to speed up the frontend
The following steps are to help speed up your frontend application. The frontend is the part of a website or application that the user directly interacts with. For example, this includes fonts, drop-down menus, buttons, transitions, sliders, forms, etc.
Step 2: Optimize your images
Use WebP for images when possible (Cloudflare, a CDN, does this for you automatically — I’ll touch more on Cloudflare below). It’s an image format that supports both lossy and lossless compression.
Always use images with the correct size. For example, if you have an image that is displayed in a 2” x 2” square on your site, don’t use a large 10” x 10” image. If you have an image that is bigger than needed, you are transferring more data over the network and the browser has to resize the image for you.
Use lazy load to avoid/delay downloading images that are further down the page and not in the visible part of the screen.
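One simple way to get lazy loading in modern browsers is the native `loading` attribute (JavaScript lazy-load libraries are the alternative for older browsers). The filename and dimensions below are placeholders:

```html
<!-- Native lazy loading: the browser defers fetching this image until
     it nears the viewport. src and dimensions are placeholders. -->
<img src="chart.webp" width="600" height="400" loading="lazy"
     alt="Cache hit ratio chart">
```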
Step 3: Optimize your CSS
You want to make sure your critical CSS is inline. Online tools like this one can help you find the critical CSS to be inlined and will solve render blocking. Bonus: you’ll keep the cache benefit of having separate files.
Make sure to minify your CSS files (we use AdVagg since we are on the Drupal CMS, but there are many options for this depending on your site).
Try using less CSS. For instance, if you have certain CSS classes that are only used on your homepage, don’t include them on other pages.
Always combine the CSS files but use multiple bundles. You can read more about this step here.
Move your media queries to specific files so the browser doesn’t have to load them before rendering the page. For example: <link href="frontpage-sm.css" rel="stylesheet" media="(min-width: 767px)">
If you’d like more info on how to optimize your CSS, check out Patrick Sexton’s interesting post.
Step 4: Lighten your web fonts (they can be HEAVY)
This is where your developers may get into an argument with your designers if you’re not careful. Everyone wants to look at a beautifully designed website, but if you’re not careful about how you bring this design to life, it can cause major unintended speed issues. Here are some tips on how to put your fonts on a diet:
Use inline SVG for icon fonts (like Font Awesome). This way you’ll shorten the critical chain path and will avoid empty content when the page is first loaded.
Use fontello to generate the font files. This way, you can include only the glyphs you actually use which leads to smaller files and faster page speed.
If you are going to use web fonts, check if you need all the glyphs defined in the font file. If you don’t need Japanese or Arabic characters, for example, see if there is a version with only the characters you need.
Use Unicode range to select the glyphs you need.
Use woff2 when possible as it is already compressed.
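Putting the last three tips together, a subset font can be declared like this. The font name, path, and Latin-only unicode range are illustrative assumptions, not WSO's actual font setup:

```css
/* Sketch: serve a compressed woff2 subset and limit it to basic Latin
   glyphs via unicode-range. Family name and path are placeholders. */
@font-face {
  font-family: "BodyFont";
  src: url("/fonts/bodyfont-latin.woff2") format("woff2");
  unicode-range: U+0000-00FF; /* basic Latin only */
  font-display: swap; /* show fallback text while the font loads */
}
```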
Here is the difference we measured when using optimized fonts:
After reducing our font files from 131kb to 41kb and removing one external resource (useproof), the fully loaded time of our test page dropped all the way from 5.1 to 2.8 seconds. That’s a 44 percent improvement and is sure to make Google smile (see below).
Here’s the 44 percent improvement.
Step 5: Move external resources
When possible, move external resources to your server so you can control expire headers (this will instruct the browsers to cache the resource for longer). For example, we moved our Facebook Pixel to our server and cached it for 14 days. This means you’ll be responsible for checking for updates from time to time, but it can improve your page speed score.
For example, on our Private Equity Interview Questions page you can see how the fbevents.js file is being loaded from our server and the cache-control HTTP header is set to 14 days (1209600 seconds):
cache-control: public, max-age=1209600
Step 6: Use a content delivery rete televisiva privata (CDN)
No browsers currently support HTTP/2 over an unencrypted connection. For practical purposes, this means that your website must be served over HTTPS to take advantage of HTTP/2. Cloudflare has a free and easy way to enable HTTPS. Check it out here.
Service workers give the site owner and developers some interesting options (like push notifications), but in terms of performance, we’re most excited about how these workers can help us build a smarter caching system.
To learn how to get service workers up and running on your site, visit this page.
Testing, tools, and takeaways
For each change you make to try and improve speed, you can use the following tools to monitor the impact of the change and make sure you are on the right path:
We know there is a lot to digest and a lot of resources linked above, but if you are tight on time, you can just start with Step 1 from both the Backend and Front-End sections. These two steps can make a major difference on their own.
Good luck and let me know if you have any questions in the comments. I’ll make sure João Guilherme, my Head of Technology, is on hand to answer any questions for the community at least once a day for the first week after this is published.
We’ve arrived at one of the meatiest SEO topics in our series: technical SEO. In this fifth part of the One-Hour Guide to SEO, Rand covers essential technical topics, from crawlability to internal link structure to subfolders and far more. Watch on for a firmer grasp of technical SEO fundamentals!
Click on the whiteboard image above to open a high resolution version in a new tab!
Howdy, Moz fans, and welcome back to our special One-Hour Guide to SEO Whiteboard Friday series. This is Part V – Technical SEO. I want to be totally upfront. Technical SEO is a vast and deep discipline, like any of the things we’ve been talking about in this One-Hour Guide.
There is no way in the next 10 minutes that I can give you everything that you’ll ever need to know about technical SEO, but we can cover many of the big, important, structural fundamentals. So that’s what we’re going to tackle today. You will come out of this having at least a good idea of what you need to be thinking about, and then you can go explore more resources from Moz and many other wonderful websites in the SEO world that can help you along these paths.
1. Every page on the website is unique & uniquely valuable
First off, every page on a website should be two things — unique, unique from all the other pages on that website, and uniquely valuable, meaning it provides some value that a user, a searcher, would actually desire and want. Sometimes the degree to which it’s uniquely valuable may not be enough, and we’ll need to do some intelligent things.
So, for example, if we’ve got a page about X, Y, and Z versus a page that’s sort of, “Oh, this is a little bit of a combination of X and Y that you can get through searching and then filtering this way. Oh, here’s another copy of that XY, but it’s a slightly different version. Here’s one with YZ. This is a page that has almost nothing on it, but we sort of need it to exist for this weird reason that has nothing to do with search, and no one would ever want to find it through search engines.”
Okay, when you encounter these types of pages as opposed to these unique and uniquely valuable ones, you want to think about: Should I be canonicalizing those, meaning pointing this one back to this one for search engine purposes? Maybe YZ just isn’t different enough from Z for it to be a separate page in Google’s eyes and in searchers’ eyes. So I’m going to use something called the rel=canonical tag to point this YZ page back to Z.
Maybe I want to remove these pages. Oh, this is totally non-valuable to anyone. 404 it. Get it out of here. Maybe I want to block bots from accessing this section of our site. Maybe these are search results that make sense if you’ve performed this query on our site, but they don’t make any sense to be indexed in Google. I’ll keep Google out of it using the robots.txt file or the meta robots tag or other things.
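In markup, the two tools Rand mentions look like this. The URLs are placeholders standing in for the "YZ" and "Z" pages in his example:

```html
<!-- On the near-duplicate "YZ" page, point search engines at the
     canonical "Z" version (URL is a placeholder): -->
<link rel="canonical" href="https://example.com/z">

<!-- Or keep a page out of the index entirely with a meta robots tag: -->
<meta name="robots" content="noindex, nofollow">
```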
2. Pages are accessible to crawlers, load fast, and can be fully parsed in a text-based browser
Secondarily, pages are accessible to crawlers. They should be accessible to crawlers. They should load fast, as fast as you possibly can. There’s a ton of resources about optimizing images and optimizing server response times and optimizing first paint and first meaningful paint and all these different things that go into speed.
But speed is good not only because of technical SEO issues, meaning Google can crawl your pages faster (oftentimes when people speed up the load speed of their pages, they find that Google crawls more from them and crawls them more frequently, which is a wonderful thing), but also because pages that load fast make users happier. When you make users happier, you make it more likely that they will link and amplify and share and come back and keep loading and not click the back button, all these positive things and avoiding all these negative things.
3. Thin content, duplicate content, spider traps/infinite loops are eliminated
Thin content and duplicate content — thin content meaning content that doesn’t provide meaningfully useful, differentiated value, and duplicate content meaning it’s exactly the same as something else — spider traps and infinite loops, like calendaring systems, these should generally speaking be eliminated. If you have those duplicate versions and they exist for some reason, for example maybe you have a printer-friendly version of an article, the regular version of the article, and the mobile version of the article, there should probably be some canonicalization going on there, the rel=canonical tag being used to say this is the original version and here’s the mobile-friendly version and those kinds of things.
If you have search results in the search results, Google generally prefers that you don’t do that. If you have slight variations, Google would prefer that you canonicalize those, especially if the filters on them are not meaningfully and usefully different for searchers.
4. Pages with valuable content are accessible through a shallow, thorough internal links structure
Number four, pages with valuable content on them should be accessible through just a few clicks, in a shallow but thorough internal link structure.
Now this is an idealized version. You’re probably rarely going to encounter exactly this. But let’s say I’m on my homepage and my homepage has 100 links to unique pages on it. That gets me to 100 pages. One hundred more links per page gets me to 10,000 pages, and 100 more gets me to 1 million.
So that’s only three clicks from homepage to one million pages. You might say, “Well, Rand, that’s a little bit of a perfect pyramid structure.” I agree. Fair enough. Still, three to four clicks to any page on any website of nearly any size, unless we’re talking about a site with hundreds of millions of pages or more, should be the general rule. I should be able to follow that through either a sitemap or the site’s internal links.
If you have a complex structure and you need to use a sitemap, that’s fine. Google is fine with you using an HTML page-level sitemap. Or alternatively, you can just have a good link structure internally that gets everyone easily, within a few clicks, to every page on your site. You don’t want to have these holes that require, “Oh, yeah, if you wanted to reach that page, you could, but you’d have to go to our blog and then you’d have to click back to result 9, and then you’d have to click to result 18 and then to result 27, and then you can find it.”
No, that’s not ideal. That’s too many clicks to force people to make to get to a page that’s just a little ways back in your structure.
5. Pages should be optimized to display cleanly and clearly on any device, even at slow connection speeds
Five, I think this is obvious, but for many reasons, including the fact that Google considers mobile friendliness in its ranking systems, you want to have a page that loads clearly and cleanly on any device, even at slow connection speeds, optimized for both mobile and desktop, optimized for 4G and also optimized for 2G and no G.
6. Permanent redirects should use the 301 status code, dead pages the 404, temporarily unavailable the 503, and all working pages the 200 status code
Permanent redirects. So this page was here. Now it’s over here. This old content, we’ve created a new version of it. Okay, old content, what do we do with you? Well, we might leave you there if we think you’re valuable, but we may redirect you. If you’re redirecting old stuff for any reason, it should generally use the 301 status code.
If you have a dead page, it should use the 404 status code. You could maybe sometimes use 410, permanently removed, as well. Temporarily unavailable, like we’re having some downtime this weekend while we do some maintenance, 503 is what you want. Everything is fine, everything is great, that’s a 200. All of your pages that have meaningful content on them should have a 200 code.
These status codes, and maybe the 410, are the ones to use; anything else beyond these, generally speaking, should be avoided. There are some very occasional, rare, edge use cases. But if you see status codes other than these, for example if you’re using Moz, which crawls your website, reports all this data to you, and does this technical audit every week, then Moz or other software like it (Screaming Frog, Ryte, DeepCrawl, or these other kinds) will say, “Hey, this looks problematic to us. You should probably do something about this.”
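The audit rule these tools apply can be boiled down to a few lines: a small set of expected codes, and a flag for anything else. This is a rough sketch of the heuristic described above, not the logic of Moz, Screaming Frog, or any other real crawler:

```python
# Rough sketch of the status-code audit rule: 200/301/404/503 (and the
# occasional 410) are expected; any other code gets flagged for review.
EXPECTED_CODES = {200, 301, 404, 410, 503}

def flag_status(code: int) -> str:
    """Return 'ok' for expected HTTP status codes, 'investigate' otherwise."""
    return "ok" if code in EXPECTED_CODES else "investigate"
```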
7. Use HTTPS (and make your site secure)
When you are building a website that you want to rank in search engines, it is very wise to use a security certificate and to have HTTPS rather than HTTP, the non-secure version. Those should also be canonicalized. There should never be a time when HTTP is the one that is loading preferably. Google also gives a small reward (I’m not even sure it’s that small anymore; it might be fairly significant at this point) to pages that use HTTPS, or a penalty to those that don’t.
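Canonicalizing HTTP to HTTPS is usually done at the server level with a 301. Here is a minimal sketch in nginx syntax; the domain is a placeholder borrowed from the example below, and real setups also need the separate HTTPS server block with the certificate:

```nginx
# Sketch: 301-redirect all plain-HTTP requests to the HTTPS version,
# so HTTP is never the version that loads. Domain is a placeholder.
server {
    listen 80;
    server_name allmystuff.com;
    return 301 https://allmystuff.com$request_uri;
}
```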
8. Use one domain (and subfolders, not subdomains)
In general, well, I don’t even want to say in general. It is nearly universal, with a few edge cases — if you’re a very advanced SEO, you might be able to ignore a little bit of this — but it is generally the case that you want one domain, not several. Allmystuff.com, not allmyseattlestuff.com, allmyportlandstuff.com, and allmylastuff.com.
Allmystuff.com is preferable for many, many technical reasons and also because the challenge of ranking multiple websites is so significant compared to the challenge of ranking one.
You want subfolders, not subdomains, meaning I want allmystuff.com/seattle, /la, and /portland, not seattle.allmystuff.com.
Why is this? Google’s representatives have sometimes said that it doesn’t really matter and that I should do whatever is easy for me. But I have seen so many cases over the years, case studies of folks who moved from a subdomain to a subfolder and saw their rankings increase overnight. Credit to Google’s reps.
I’m sure they’re getting their information from somewhere. But very frankly, in the real world, it just works all the time to put it in a subfolder. I have never seen a problem being in the subfolder versus the subdomain, where there are so many problems and there are so many issues that I would strongly, strongly urge you against it. I think 95% of professional SEOs, who have ever had a case like this, would do likewise.
Relevant folders should be used rather than long, hyphenated URLs. This is one where we agree with Google. Google generally says, hey, if you have allmystuff.com/seattle/storage-facilities/top-10-places, that is far better than /seattle-storage-facilities-top-10-places. It’s just the case that Google is good at folder structure analysis and organization, and users like it as well, and you get good breadcrumbs from there.
There’s a bunch of benefits. Generally, using this folder structure is preferred to very, very long URLs, especially if you have multiple pages in those folders.
9. Use breadcrumbs wisely on larger/deeper-structured sites
Last, but not least, at least the last thing that we’ll talk about in this technical SEO discussion, is using breadcrumbs wisely. So breadcrumbs are actually both technical and on-page; they’re good for this.
Google generally learns some things from the structure of your website from using breadcrumbs. They also give you this nice benefit in the search results, where they show your URL in this friendly way, especially on mobile, mobile more so than desktop. They’ll show home > seattle > storage facilities. Great, looks beautiful. Works nicely for users. It helps Google as well.
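For the "home > seattle > storage facilities" trail, Google's documented way to mark up breadcrumbs is schema.org BreadcrumbList in JSON-LD. The names and URLs below are placeholders matching Rand's example site:

```html
<!-- Schema.org BreadcrumbList markup (names/URLs are placeholders)
     that helps Google show the breadcrumb trail in search results. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    {"@type": "ListItem", "position": 1, "name": "Home",
     "item": "https://allmystuff.com/"},
    {"@type": "ListItem", "position": 2, "name": "Seattle",
     "item": "https://allmystuff.com/seattle"},
    {"@type": "ListItem", "position": 3, "name": "Storage Facilities",
     "item": "https://allmystuff.com/seattle/storage-facilities"}
  ]
}
</script>
```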
So there are plenty more in-depth resources that we can go into on many of these topics and others around technical SEO, but this is a good starting point. From here, we will take you to Part VI, our last one, on link building next week. Take care.
Consideration: Billy realizes he needs a tool to do SEO. He finds that most posts recommend a few tools and begins comparing them. He reads reviews, asks questions in forums and consumes the educational content available on the different blogs.
Conversion: He decides to take advantage of our 7‐day trial. Convinced that we’re the right solution for him, he proceeds to sign up for a paid plan.
Now, Billy Blogger is just one of the many customers that buy from us. In reality, we have a few different types of customers. And their buying journeys are different.
To cater to the different journeys, you first need to understand who you’re targeting.
Defining your buyer personas
A buyer persona is an “imaginary person” you create that represents the common characteristics of your customer. It helps you visualize their buying journey, internalize who they are and empathize with their struggles.
The more types of customers you have, the more personas you should create.
For example, at Ahrefs, a potential buyer persona would be Billy Blogger.
However, we also have:
Anna Agency (an agency owner);
Lily Local (a local business owner);
Ian Inhouse (an in‐house digital marketer)
When creating these personas, your goal is to get super detailed on who you’re targeting. Give them names and faces. Fill in their demographics and psychographics. Understand their goals, challenges, hopes, fears and pain points.
In essence, what you’re looking for are patterns and commonalities. Some important data points are:
Typically, keyword research involves looking for keywords with the highest search volume and the lowest competition (in Ahrefs, that is known as Keyword Difficulty).
But that’s not what we want.
Keyword research is not only about search volume. I would say it isn’t entirely about traffic either. It’s about choosing topics that potential customers are searching for, serving their needs and eventually converting them into customers.
The stage your buyers are in will determine the search queries they’re making. This concept—known as search intent—is the objective a searcher has when entering a query into Google.
You can generally categorize search intent into four groups. These four groups, as you might have guessed, roughly match the buyer’s journey/marketing funnel.
Let’s run through these four types of search intent and how they align with the buyer’s journey:
Informational — The searcher is looking to gain general knowledge on a topic, or gather more information about something. For example, “how to get more traffic.”
Navigational — The searcher knows the destination they want to reach. For example, “ahrefs on-page SEO guide.”
Commercial Investigation — The searcher is looking to get information on something they want to buy. For example, “best keyword research tool.”
Transactional — The searcher is ready to make a purchase. For example, “ahrefs pricing.”
Your goal is to find keywords corresponding to each intent and create content around those keywords.
A quick note
Why do you have to create content for each stage? Why not only target transactional keywords, since they’re the ones that drive direct revenue?
Transactional keywords usually have lower search demand.
Unless you have a low‐cost, impulse‐driven product, people usually don’t buy at first sight. They’d much prefer to buy from someone they know and trust. Informational content helps build trust and authority.
Informational queries allow you to enter their conversion journey early, so you can guide them towards choosing your product or service.
How can you determine search intent from a keyword phrase?
With some keywords, you can’t tell from the search query alone. But there are a good number of them that can be easily identified using keyword footprints.
Here’s a list of modifier words that typically indicate the stage the buyer is in:
Name of a product
Name of a service
Attribute of a product (size, color)
[city] type of store (local)
cheap
Navigational queries are perhaps the most nuanced. Those searching for such queries may be in the “Interest” stage of the funnel and just want to learn more about your products/services. Or they could already be customers and are simply trying to navigate to a specific page on your website. To see one example of how you can tackle that issue, try visiting our Content Explorer page when you’re logged in and out of your Ahrefs account. You will see that the content is different. That helps us cater to different stages of the buyer’s journey with one page.
Use these modifiers to find your desired keywords. Here’s how.
First, enter a few seed keywords related to your business into Ahrefs’ Keywords Explorer. Then, go to the “Having same terms” report which will show you all the keyword ideas that contain the target keywords as a broad match.
From this report, you can filter by search intent. Grab the modifiers from any of the stages, and plug them in the Include box.
In this case, I am searching for keywords with informational intent.
You may have noticed that informational keywords tend to be questions, like “what” and “how.” You can filter for these keywords automatically by selecting the “Questions” report:
Now all that’s left is to scan through this list of ideas and pick those that are relevant for your business.
Repeat the same process for the other stages.
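The modifier-based filtering above can be sketched as a simple classifier. To be clear, the modifier lists here are illustrative samples (not Ahrefs' internal rules), and, as discussed below, footprints are never foolproof, so treat the output as a first-pass guess:

```python
# Hedged sketch of footprint-based intent classification. Modifier
# lists are illustrative samples; real keyword sets need manual review.
INTENT_MODIFIERS = {
    "informational": ["how", "what", "why", "guide", "tips"],
    "commercial": ["best", "top", "review", "vs", "cheap"],
    "transactional": ["buy", "price", "pricing", "coupon", "discount"],
}

def guess_intent(keyword: str) -> str:
    """Guess search intent from keyword footprints; 'unknown' if none match."""
    words = keyword.lower().split()
    for intent, modifiers in INTENT_MODIFIERS.items():
        if any(m in words for m in modifiers):
            return intent
    return "unknown"
```

For example, "ahrefs pricing" would be guessed as transactional, while a bare keyword like "protein powder" falls through to "unknown" and needs a manual SERP check.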
Other ways to find and map keywords
That was the easiest way to identify and map keywords to the stages of the buyer’s journey.
However, footprints aren’t foolproof. There are plenty of keywords that don’t contain such footprints. Neglect these, and you will miss out on some good keyword ideas.
How can you find these ‘missing’ keywords?
Here are three suggestions:
1. SERP Features
Ever seen this when you did a search on Google?
This is known as a “featured snippet.” It’s when Google shows an answer (or a partial answer) to the question directly in the search results.
The featured snippet is one of the many SERP features Google shows. You might have seen some others, like:
Here’s the interesting part:
The presence of certain SERP features can help you to understand the intent behind the search. In other words, if you’re looking for keywords that align with a specific stage of the buying journey, you can use SERP features to help do that.
Here are some rough guidelines:
People also ask
Not all keywords fit neatly into one of these four boxes. This is commonly the case. So while the presence of certain SERP features can help with inferring search intent, these rules aren’t set in stone. The truth is that many keywords have mixed search intent. For example, one person searching for “protein powder” may be in buying mode (transactional). Another person may wish to learn more about protein powder (informational). You should always manually review the SERP if the search intent is unclear.
To find these keywords in Ahrefs’ Keywords Explorer, you can filter to include or exclude keywords with particular SERP features.
For example, say that I’m looking for keywords with informational intent: I could type in a broad keyword into Keywords Explorer, go to the Phrase Match report and filter for featured snippets.
Voila! A list of keywords where a featured snippet appears in the SERPs. These are almost always keywords with informational intent.
NOTE. To reiterate, you should still do a manual review of the SERP to make sure these keywords are indeed informational. Google may show a featured snippet for keywords that are not informational. For example, the keyword “best headphones” shows a featured snippet, but it is a commercial investigation keyword.
2. Cost‐per‐click (CPC)
Cost‐per‐click is the average price advertisers pay for a click on Google’s paid search results.
If you’re paying for every click on Google, you’ll want to see a return on investment (ROI). Otherwise, you’ll just be flushing money down the drain.
Most advertisers will target transactional keywords.
That makes sense. These searchers are ready to spend. All advertisers need to do is to appear sopra the search results and convince them to click their ad.
What does that mean for the buyer’s journey?
Generally speaking, the higher the CPC, the closer the keyword is to conversion.
To find these keywords, enter a broad keyword from your niche into Keywords Explorer. Then, sort by CPC from high to low.
Keywords with a high CPC—and thus, likely to have transactional intent—will rise to the top.
NOTE. This isn’t entirely foolproof, as some advertisers may be bidding on informational keywords too. As always, do a manual review to make sure the intent is right.
3. Online communities
There are times when it’s difficult to find the right search query. It could happen because you’re completely new to the niche, or the results you’re getting in Google are unsatisfactory.
In times like these, I turn to communities.
Me asking a question in a Facebook group for my previous job
The same goes for your customers.
They visit online communities (like Reddit and Quora) for various reasons. They might be looking for answers to their questions, or to get advice on which product to choose, etc.
Here’s an example of someone asking a community what tools they use for SEO:
I would guess that he checked out some of the tools recommended by the group, and perhaps even bought one of them!
For informational and commercial investigation keywords, communities are ripe for the picking.
Example: For Billy Blogger, the subreddit r/blogging might be a place he hangs out. So, I’ll head over to r/blogging and check out what topics they’re discussing.
Instead of looking through every discussion, I’ll do some sorting to find the most popular ones. I’ll sort by “Top” (i.e., most upvotes) and add a time period of “All Time”:
After some scrolling, I find this topic that seems like a good fit.
With 30 upvotes and 70 comments, this topic on “promoting your blog” seems to be a hit with people like Billy Blogger.
I would guess that the intent is informational. But to double check, we can enter this keyword into Google and look at the top 10 ranking results.
Looks like people are searching for tactics on how to promote their blog posts, i.e., informational intent.
Many of the keyword ideas you find in communities will have low search volumes.
But that doesn’t necessarily mean it is a bad topic. It may mean that people who are looking for similar topics are not using this exact language.
You can find the most popular way people are searching for a topic using Keywords Explorer.
When you enter a keyword idea, Keywords Explorer will suggest a Parent Topic, which is basically the keyword sending the most traffic to the #1 ranking page.
In this case, the keyword “how to promote your blog” may be a better topic to target.
Accelerating the buyer’s journey
Most people think of the buyer’s journey as a slow and long process.
They imagine their customer taking months (or even years) to go through all four stages of the buyer’s journey. That can be true for certain companies if their products or services are on the expensive end.
But it need not always be the case.
The cool thing about creating content for each stage of the journey is that you can use it to accelerate the buyer’s journey.
Imagine a potential path that Billy Blogger might take. He has decided to start a blog, but he doesn’t know how to drive traffic to his site. So he searches for “how to promote your blog” and finds himself reading our article with the same title. Within the blog post, we talk about getting traffic from Google, and how powerful a strategy SEO can be.
He discovers another post—Tim’s post on increasing blog traffic—and learns all about keyword research. He reads our keyword research post and learns about our tool, and how it can help generate tons of keyword ideas with relevant SEO metrics.
He decides to try our 7‐day trial and implements the strategies we suggest. Along the way, he discovers more things to do with our tool and decides to upgrade to a paid plan.
Of course, this is an ideal path. Not many people will sign up for our tool this way. But by targeting different stages of the journey, and with some smart internal linking, we can help guide Billy from knowing nothing about SEO to being aware of our product and brand, and possibly even considering a purchase.
As you are targeting and creating content for each stage, make sure you’re adding relevant internal links to the next logical stage. If you have a blog post targeting a topic in the Awareness stage, make sure it links to a relevant page in the Interest stage.
Internal links aren’t the only way to do this. You can also consider other tactics like retargeting, chat, etc.
It’s all about mapping the buyer’s journey to the marketing funnel, guiding people from being problem‐aware to being product‐aware, and ultimately towards being a customer.
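The stage-to-stage mapping can be sketched as a simple data structure. The URLs and stage names below are hypothetical, but they show how each piece of content points an internal link at the next logical stage of the funnel:

```python
# A minimal funnel map: each stage has one flagship piece of content
# and an internal link to the next logical stage. All paths invented.
funnel = {
    "awareness":     {"post": "/blog/how-to-promote-your-blog", "next": "interest"},
    "interest":      {"post": "/blog/keyword-research",         "next": "consideration"},
    "consideration": {"post": "/keywords-explorer",             "next": "purchase"},
    "purchase":      {"post": "/pricing",                       "next": None},
}

# Walk Billy's ideal path from first touch to checkout.
stage, path = "awareness", []
while stage:
    path.append(funnel[stage]["post"])
    stage = funnel[stage]["next"]
print(" -> ".join(path))
```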
Keyword research isn’t just a volume or traffic game. There must be a logical methodology behind why you’re choosing certain keywords and how they come together in the grand scheme of things.
This is why it makes sense to think about keywords sopra the context of the buyer’s journey.
If you can deliver the right content to them at the right time, you can develop your authority, trustworthiness, and influence with potential customers.
And when it comes time to buy, there will be no more obvious choice than you.
From networking with your peers to hearing from industry leaders, there are benefits aplenty to attending conferences. You know that. Your peers know that. But how do you persuade the powers-that-be (aka your boss) that sending you is beneficial for your business?
To help convince your boss that you won’t just be lounging poolside, sipping cocktails on the company dime, we’ve gathered the goods to help you get your boss to greenlight your MozCon attendance.
How to make the case
Business competition is fiercer than ever. What used to make a splash now feels like it’s barely making ripples. Only those who are able to shift tactics with the changing tides of marketing will be able to come out on top.
And that’s exactly what MozCon is going to help you do.
Covering everything a growing marketer needs for a well-balanced marketing diet (SEO, content, strategy, growth), MozCon delivers top-notch talks from hand-selected speakers over three insightful days in July.
There’s so much in store for you this year. Here’s just a sampling of what you can expect at this year’s MozCon:
Speakers and content
Our speakers are real practitioners and industry leaders. We work with them to ensure they deliver the best content and insights to the stage to set you up for a year of success. No sales pitches or talking heads here!
You work hard taking notes, learning new insights, and digesting all of that knowledge — that’s why we think you deserve a little fun in the evenings. It’s your chance to decompress with fellow attendees and make new friends in the industry. We host exciting evening networking events that add to the value you’ll get from your day of education. Plus, our Birds of a Feather lunch tables allow you to connect with like-minded peers who share similar interests.
High-quality videos to share with your team
About a month or so after the conference, we’ll send you a link to professionally edited videos of every presentation at the conference. Your colleagues won’t get to partake in the morning Top Pot doughnuts or Starbucks coffee (the #FOMO is real), but they will get a chance to learn everything you did, for free.
An on-going supportive group
Our MozCon Facebook group is incredibly active, and it’s grown to have a life of its own — marketers ask one another SEO questions, post jobs, ask for and offer advice and empathy, and more. It’s a great place to find TAGFEE support and camaraderie long after the conference itself has ended.
Great food on site
We know that conference food isn’t typically worth mentioning, but MozCon is notorious for its snacking. You can expect two hot meals a day and loads of snacks from local Seattle vendors — in the past we’ve featured a smorgasbord from the likes of Trophy Cupcakes, KuKuRuZa popcorn, and Starbucks’ Seattle Reserve cold brew.
No duds here: we do our homework when it comes to selecting swag worthy of keeping. One-of-a-kind Roger Mozbots, a super-soft t-shirt, and more cool stuff you’ll want to take home and show off.
Wear your heart on your sleeve
MozCon and our attendees give back each year through donating Moz dollars towards a charitable organization.
Discounts for subscribers and groups
Moz subscribers get a whopping $500 off their ticket cost, and there are discounts for groups as well, so make sure to take advantage of savings where you can!
At MozCon our goal is to break even, which means we invest all of your ticket price back into you. Check out the full breakdown of what your MozCon ticket gets you:
But of course, don’t take our word for it! There are some incredible resources available at your fingertips that tout the benefits of attending conferences:
Need a little more to get your boss on board? Check out some videos from years past to get a taste of the caliber of our speakers. We’ve also got a call for community speaker pitches (closes at 5 pm PDT on April 15, 2019), so if you’ve been thinking about breaking into the speaking circuit, it could be an amazing opportunity.
Buy a ticket, save money, get competitive marketing insights. Everyone wins!
MozCon is one unforgettable experience that lives and grows with you beyond just the three days you spend in Seattle. And there’s no time like the present to pitch MozCon to your boss. If they’re still stuck on the “why”, let them know about our subscriber and group pricing tiers — you’ll save hundreds of dollars when you do. Just think of all the Keurigs you could get for that communal kitchen!
On Friday, April 5, after many website owners and SEOs reported pages falling out of rankings, Google confirmed a bug that was causing pages to be deindexed:
MozCast showed a multi-day increase in temperatures, including a 105° spike on April 6. While deindexing would naturally cause ranking flux, as pages temporarily fell out of rankings and then reappeared, SERP-monitoring tools aren’t designed to separate the different causes of flux.
Can we isolate deindexing flux?
Google’s own tools can help us check whether a page is indexed, but doing this at scale is difficult, and once an event has passed, we no longer have good access to historical data. What if we could isolate a set of URLs, though, that we could reasonably expect to be stable over time? Could we use that set to detect unusual patterns?
Across the month of February, the MozCast 10K daily tracking set had 149,043 unique URLs ranking on page one. I reduced that to a subset of URLs with the following properties:
They appeared on page one every day in February (28 total times)
The query did not have sitelinks (i.e., no clear dominant intent)
The URL ranked at position #5 ora better
Since MozCast only tracks page one, I wanted to avoid noise from a URL “falling off” from, say, position #9 to #11. Using these qualifiers, I was left with a set of 23,237 “stable” URLs. So, how did those URLs perform over time?
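The filter described above is easy to express in code. Here is a small Python sketch, run over hypothetical ranking records rather than the real MozCast data set:

```python
from collections import defaultdict

# A URL qualifies as "stable" if it ranked on page one every day of
# February, at position #5 or better, on a query without sitelinks.
# All records below are invented for illustration.
DAYS_IN_FEB = 28

records = [
    # (url, day, position, query_has_sitelinks)
    *[("a.com/1", d, 3, False) for d in range(1, 29)],  # qualifies
    *[("b.com/2", d, 9, False) for d in range(1, 29)],  # ranks too deep
    *[("c.com/3", d, 2, True)  for d in range(1, 29)],  # query has sitelinks
    *[("d.com/4", d, 4, False) for d in range(1, 28)],  # missed one day
]

days_seen = defaultdict(set)
for url, day, pos, sitelinks in records:
    if pos <= 5 and not sitelinks:
        days_seen[url].add(day)

stable = {url for url, days in days_seen.items() if len(days) == DAYS_IN_FEB}
print(sorted(stable))  # only a.com/1 survives all three qualifiers
```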
Here’s the historical data from February 28, 2019 through April 10. This graph is the percentage of the 23,237 stable URLs that appeared in MozCast SERPs:
Since all of the URLs in the set were stable throughout February, we expect 100% of them to appear on February 28 (which the graph bears out). The change over time isn’t dramatic, but what we see is a steady drop-off of URLs (a natural occurrence of changing SERPs over time), with a distinct drop on Friday, April 5th, a recovery, and then a similar drop on Sunday, April 7th.
Could you zoom for us old folks?
Having just switched to multifocal contacts, I feel your pain. Let’s zoom that Y-axis a bit (I wanted to show you the unvarnished truth first) and add a trendline. Here’s that zoomed-in graph:
The trendline is purple. The departure from trend on April 5th and 7th is pretty easy to see in the zoomed-in version. The day-over-day drop on April 5th was 4.0%, followed by a recovery, and then a second, very similar, 4.4% drop.
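The retention metric itself is easy to reproduce. The series below is illustrative, shaped like the graph (a slow decay plus two sharp dips), and the loop flags any day-over-day drop of 4 percentage points or more:

```python
# Share of February's stable URLs still appearing in tracked SERPs
# each day. These numbers are invented to mirror the graph's shape.
retained = {  # date -> % of stable URLs still ranking
    "Apr 03": 97.8,
    "Apr 04": 97.5,
    "Apr 05": 93.5,  # first dip
    "Apr 06": 96.8,  # recovery
    "Apr 07": 92.4,  # second dip
}

dates = list(retained)
for prev, curr in zip(dates, dates[1:]):
    delta = retained[curr] - retained[prev]
    flag = "  <-- unusual" if delta <= -4.0 else ""
    print(f"{curr}: {retained[curr]:5.1f}% ({delta:+.1f} pts){flag}")
```

With these made-up values, only April 5th and April 7th are flagged, which is the same pattern the real data showed.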
Note that this metric moved very little during March’s algorithm flux, including the March “core” update. We can’t prove definitively that the stable URL drop cleanly represents deindexing, but it appears not to be impacted much by typical Google algorithm updates.
What about dominant intent?
I purposely removed queries with expanded sitelinks from the analysis, since those are highly correlated with dominant intent. I hypothesized that dominant intent might mask some of the effects, as Google is highly invested in surfacing specific sites for those queries. Here’s the same analysis just for the queries with expanded sitelinks (this yielded a smaller set of 5,064 stable URLs):
Other than minor variations, the pattern for dominant-intent URLs appears to be very similar to the previous analysis. It appears that the impact of deindexing was widespread.
Was it random or systematic?
It’s difficult to determine whether this bug was random, affecting all sites somewhat equally, or was systematic in some way. It’s possible that restricting our analysis to “stable” URLs is skewing the results. On the other hand, trying to measure the instability of inherently unstable URLs is a bit nonsensical. I should also note that the MozCast data set is skewed toward so-called “head” terms. It doesn’t contain many queries in the very long tail, including natural-language questions.
One question we can answer is whether large sites were impacted by the bug. The graph below isolates our “Big 3” in MozCast: Wikipedia, Amazon, and Facebook. This reduced us to 2,454 stable URLs. Unfortunately, the deeper we dive, the smaller the data set gets:
At the same 90–100% zoomed-in scale, you can see that the impact was smaller than across all stable URLs, but there’s still a clear pair of April 5th and April 7th dips. It doesn’t appear that these mega-sites were immune.
Looking at the day-over-day data from April 4th to 5th, it appears that the losses were widely distributed across many domains. Of domains that had 10-or-more stable URLs on April 4th, roughly half saw some loss of ranking URLs. The only domains that experienced 100% day-over-day loss were those that had 3-or-fewer stable URLs in our data set. It does not appear from our data that deindexing systematically targeted specific sites.
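Here is a sketch of that day-over-day comparison, with made-up per-domain counts standing in for the real data:

```python
from collections import Counter

# Stable-URL counts per domain on consecutive days (hypothetical).
apr4 = Counter({"wikipedia.org": 40, "amazon.com": 25, "nytimes.com": 12, "smallsite.com": 3})
apr5 = Counter({"wikipedia.org": 38, "amazon.com": 25, "nytimes.com": 10, "smallsite.com": 0})

# Of domains with 10-or-more stable URLs on day one, which lost at
# least one ranking URL by day two? (Counter returns 0 for missing
# keys, so domains that vanish entirely are handled naturally.)
big_domains = [d for d, n in apr4.items() if n >= 10]
lost = [d for d in big_domains if apr5[d] < apr4[d]]
wiped_out = [d for d in apr4 if apr5[d] == 0]

print(f"{len(lost)}/{len(big_domains)} large domains lost stable URLs")
print("fully deindexed:", wiped_out)  # only the tiny domain in this sample
```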
Is this over, and what’s next?
As one of my favorite movie quotes says: “There are no happy endings because nothing ever ends.” For now, indexing rates appear to have returned to normal, and I suspect that the worst is over, but I can’t predict the future. If you suspect your URLs have been deindexed, it’s worth manually requesting reindexing in Google Search Console. Note that this is a fairly tedious process, and there are daily limits in place, so focus on critical pages.
The impact of the deindexing bug does appear to be measurable, although we can argue about how “big” 4% is. For something as consequential as sites falling out of Google rankings, 4% is quite a bit, but the long-term impact for most sites should be minimal. For now, there’s not much we can do to adapt — Google is telling us that this was a true bug and not a deliberate change.