A reader asks: I’m in charge of SEO for an auction portal. We are working on content and will increase the current URL count from 600,000 to about 900,000. The fear is that by publishing all these new pages at the same time, the search engine may apply some sort of penalty. Are there any risks in publishing all those new URLs at the same time?


So no, there is no penalty associated with submitting 300,000 URLs - or any number of URLs - all at once. But yes, submitting that many URLs at one time can have unintended consequences. Everything I’m writing from this point forward is my educated opinion, based on things I’ve been told over many years by people who worked at Google and on my own experience with larger sites.


Google’s algorithms note unusual patterns. Adding 50 percent more pages to a site in a day is unusual. But it doesn’t necessarily mean it’s something the site owner did. There are lots of reasons this could happen - the most common one being that a site has been hacked. Google has systems in place to identify this type of suspicious activity and apply other filters to it, because it’s dangerous to present searchers with a potentially hacked site. Sometimes it will trigger a manual review, which means a human will take a look at the situation and make a judgment call. I believe that in the machine learning world, there are likely fewer humans reviewing these anomalies than there used to be, but the machines apply the same constructs.


Some of these filters might flag content that seems to have lots of spammy links in it or appears too similar to other content. In my own experience, I once had a retail client with about 300,000 product SKUs that acquired a competitor and added about 300,000 more product SKUs in a single day. We thought this might be a problem but did it anyway. The net result is that the new 300,000 SKUs took about 12 days to be indexed. And they weren’t indexed a few at a time; the pages were "held" for those 12 days and then all indexed at once.


I assume that this delay was due to a manual review. The main site did not suffer any problems or losses during those 12 days, and once the new products were indexed, they performed similarly to the other products on the site. This was a couple of years ago, so don’t assume that your pages will be held for 12 days now; there’s really no telling for sure. The elephant in the room, though, is why you are adding so many pages at once. Presumably, it would take months or even years to rewrite all of those pages manually.


So why would you hold back your potential success until they are all done instead of rolling them out as they’re finished (one category at a time, perhaps)? If the answer is that what you’re really doing is spinning content or using one of the many commercially available products designed to automate content, tread carefully. I’m definitely seeing a high volume of automated content these days. Some of these implementations are clever, using database inputs like the number of products or including a list of the top X products in the category. Some even use two or three different templates across each category.


Automated content is performing quite well in search right now, but you can be certain it’s got an egg timer on it. If these pages don’t actually help visitors find things or complete tasks, they’re not helpful; they’re spam. And while it may not seem like it while you’re ranking well, fat, and happy, Google’s got you in its crosshairs. Is that a penalty, or just bad choices? Google doesn’t care how many pages you have on your website if they’re pages that offer value. But if you’re trying to get away with something, try not to make it obvious by launching all the spam at once. Google might suppress or filter these pages now or in the future, or your rankings may get an algorithmic downgrade. But it doesn’t really matter what you call it - it’s a penalty.


We’ve already seen some of these benefits in the form of answer boxes, knowledge panels, and more diverse search results for broad-tail queries. In fact, content relevance to user intent can be argued to be the most important ranking factor, because if your content is not relevant to a search, then it will be devalued. Understand the intent of your keywords (informational, shopping, navigational). Analyze the SERPs for these keywords and see what type of content is ranking. Research semantic similarities to that keyword and optimize content around those terms. Deep or long-form content addresses as many user concerns as possible while providing fresh perspectives on a topic.


Even search engines seem to prefer long-form content for many informational user searches. A HubSpot study found that content between 2,250 and 2,500 words tended to receive the most organic traffic. This seems to be the sweet spot for SEO, although creating pages much longer than 2,500 words, when necessary, can also be beneficial. Becoming a master of your subject matter isn’t just beneficial for SEO; it can also help you become a thought leader in your industry and create additional business opportunities. Research top-ranking pages for a target keyword and analyze their content. Add semantically related keywords to flesh out content with additional sub-topics. Answer any and all questions users may have about that topic. SEO tags still play an important role in content creation, despite the rise of semantic analysis.


Tags help by communicating the intent and syntax of your webpage document, organizing your document to make it easier for users and search engines to read, making pages more scannable, and helping your page pass the five-second rule. Insert focus keywords into title tags, the URL slug, and page titles. Create header sections (H2s, H3s, H4s) using related keywords; a quick audit sketch follows below. Ultimately, we design websites for both people and search engines. When designing for users, it’s always good to look at your website and website content from a fresh perspective. Mainly, how engaging is my content, and am I already bored with my site? User engagement, or user signals, has long been suspected of being a ranking factor for Google, even if indirectly.
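

On the tag side, here is a minimal sketch of what that audit might look like: it pulls a page’s title tag and H1-H3 headings so you can check whether your focus and related keywords actually appear in them. It assumes the requests and beautifulsoup4 packages are installed, and the URL is a placeholder you would swap for a page on your own site.

# Sketch: list the title tag and H1-H3 headings of a page for a quick on-page tag audit.
import requests
from bs4 import BeautifulSoup

url = "https://www.example.com/"  # placeholder; substitute a page from your own site
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

title = soup.title.get_text(strip=True) if soup.title else "(missing)"
print(f"Title ({len(title)} chars): {title}")

for tag in soup.find_all(["h1", "h2", "h3"]):
    print(f"{tag.name.upper()}: {tag.get_text(strip=True)}")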


Regardless, user signals can be a good indicator of improvements that you need to make on your website. The Pages per Session metric indicates how many pages a user views before leaving your site. This metric, along with average session duration (the amount of time a user spends on your site), can be found in Google Analytics. What this metric tells you is how interactive and engaging your site is, from a navigational perspective. Analyzing this, along with your behavioral flow, can shed some light on holes impacting your sales funnel or impeding conversions. It can also show you how interactive and engaging your blog or news articles are. Usually, if a reader is consuming multiple articles in one session on your site, it means you are doing something right to satisfy their intent.
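

Both metrics are reported directly in Google Analytics, but as a rough sketch of what they mean, here is how you might recompute pages per session and average session duration yourself from a hit-level export. The file name and its session_id, page, and duration_seconds columns are hypothetical, and pandas is assumed to be installed.

# Sketch: derive pages per session and average session duration from a hit-level export.
import pandas as pd

hits = pd.read_csv("hits.csv")  # hypothetical export with session_id, page, duration_seconds columns

per_session = hits.groupby("session_id").agg(
    pages=("page", "count"),
    duration_seconds=("duration_seconds", "sum"),
)

print("Pages per session:", round(per_session["pages"].mean(), 2))
print("Avg. session duration (s):", round(per_session["duration_seconds"].mean(), 1))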


Analyze pages with high bounce rates and search for opportunities to encourage longer session durations or more pages per session. Insert calls-to-action on pages to encourage conversions. Provide additional navigation options within content, such as placing interlinks in body content or providing related reading materials. Bounce rate is another confusing metric that could be either positive or negative, depending on how you look at it. Ultimately, your bounce rate indicates how satisfied users are with your landing page or website. High bounce rates could indicate that your pages aren’t engaging and don’t satisfy user intent, especially for ecommerce pages. A bounce could also indicate that the user was satisfied and got the answer they were looking for. Tell a story or lead with a compelling hook. Get rid of intrusive interstitials and pop-up advertisements.


Improve page load time. Ensure landing page copy is relevant to search queries. Your website’s listing is the first interaction a user has with your site, and CTR is one indicator of whether that interaction was successful. A low CTR could indicate that your messaging is not relevant to a user’s search. It could also indicate that your meta description or title tag is not compelling enough. Insert exact-match keywords into title tags and meta descriptions so they are bolded. Add a benefit of clicking on the page (e.g., "BOGO") into your meta description. Ensure your tags are the proper length so they don’t get truncated.
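

To catch tags that are likely to get cut off in the SERP, a small script like the sketch below can flag titles and meta descriptions that run long or are missing. The 60- and 155-character limits are common rules of thumb rather than official cutoffs, the URL list is a placeholder, and requests and beautifulsoup4 are assumed to be installed.

# Sketch: flag title tags and meta descriptions that may get truncated in search results.
import requests
from bs4 import BeautifulSoup

URLS = ["https://www.example.com/"]  # placeholder list of pages to check
TITLE_LIMIT, DESC_LIMIT = 60, 155    # rough display limits, not official cutoffs

for url in URLS:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    title = soup.title.get_text(strip=True) if soup.title else ""
    desc_tag = soup.find("meta", attrs={"name": "description"})
    desc = desc_tag.get("content", "") if desc_tag else ""
    if not title or len(title) > TITLE_LIMIT:
        print(f"{url}: title is {len(title)} chars")
    if not desc or len(desc) > DESC_LIMIT:
        print(f"{url}: meta description is {len(desc)} chars")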


Next, we need to consider how our technical structure is impacting user engagement and our keyword rankings. Technical SEO could be considered the foundation on which everything else in SEO is built. Without a solid technical foundation, your house of content will crumble. To get indexed, your website needs to be crawled. Search engine crawlers primarily discover pages through the links provided in your sitemap and the links reachable from your homepage. This makes the practice of interlinking vastly important, which we will discuss later. For now, we will only concern ourselves with making sure our website is crawlable and our crawl budget is optimized.
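

One quick crawlability sanity check is to confirm what your robots.txt actually allows and blocks for a given crawler. The sketch below uses Python’s standard-library robots.txt parser; the domain and test URLs are placeholders, and keep in mind a Disallow rule controls crawling, not a guarantee about indexing.

# Sketch: verify which URLs a robots.txt actually blocks for a given crawler.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser("https://www.example.com/robots.txt")  # placeholder domain
rp.read()

for url in ["https://www.example.com/", "https://www.example.com/search?q=widgets"]:
    allowed = rp.can_fetch("Googlebot", url)
    print(f"{'ALLOWED' if allowed else 'BLOCKED'}: {url}")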


Your crawl budget determines how many pages search engines will crawl during a crawl session. This is determined by your crawl rate and crawl demand. Crawl rate is a measurement of how many requests per second a search engine spider makes to your site, while crawl demand determines how often search engine spiders will crawl your site (depending on how popular it is). While most webmasters don’t worry about crawl budget, it’s a huge concern for larger sites. Crawl budgets allow webmasters to prioritize which pages should be crawled and indexed first, in case crawlers can’t parse through every pathway. Create a sitemap using your CMS or Screaming Frog and submit it manually through Google Search Console and Bing Webmaster Tools.
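

If your CMS can’t generate a sitemap for you, producing a minimal one is straightforward. The sketch below writes a bare-bones sitemap.xml from a list of URLs (the URLs are placeholders); you would then submit the file through Google Search Console and Bing Webmaster Tools as described above.

# Sketch: write a minimal XML sitemap from a list of URLs.
from xml.sax.saxutils import escape

urls = [
    "https://www.example.com/",
    "https://www.example.com/category/widgets/",
]  # placeholder URLs

lines = ['<?xml version="1.0" encoding="UTF-8"?>',
         '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
for url in urls:
    lines.append(f"  <url><loc>{escape(url)}</loc></url>")
lines.append("</urlset>")

with open("sitemap.xml", "w", encoding="utf-8") as f:
    f.write("\n".join(lines))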


Block any pages you don’t want crawled or indexed by placing them under a Disallow directive in your robots.txt file. Clean up redirect chains and set parameters for dynamic URLs. Having an HTTPS-secure website is very valuable for ensuring the security of transactions on your site. It’s also a soft ranking factor for Google. The number one technical error we find on clients’ sites is linking to mixed content or HTTP pages. This can occur during an SSL migration and can arise from a number of causes. While pages should theoretically redirect to their HTTPS counterparts, it still isn’t advantageous to have links to mixed content.
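

Screaming Frog will surface these references for you, but as a rough stand-in, the sketch below fetches a single HTTPS page and lists any href or src attributes that still point at plain http:// URLs. The page URL is a placeholder, and requests and beautifulsoup4 are assumed to be installed.

# Sketch: list insecure http:// references embedded in an HTTPS page (mixed content).
import requests
from bs4 import BeautifulSoup

page = "https://www.example.com/"  # placeholder HTTPS page to audit
soup = BeautifulSoup(requests.get(page, timeout=10).text, "html.parser")

for tag in soup.find_all(["a", "img", "script", "link", "iframe"]):
    ref = tag.get("href") or tag.get("src") or ""
    if ref.startswith("http://"):
        print(f"<{tag.name}> -> {ref}")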


More importantly, these mixed-content links do not always redirect. Contact your hosting provider about any issues that persist with SSL certification and implementation. Run a crawl of your website using Screaming Frog to identify mixed content errors. Place sitemaps in your robots.txt file independent of any user-agent directives. Rewrite your .htaccess file to redirect all website traffic to a specific domain using the HTTPS URL. Equally important, you don’t want content that links to broken or redirected pages. Not only can this affect page speeds, but it can also impact indexation and crawl budgets. Status code issues may appear naturally over time or as a result of a site migration.
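

A lightweight version of that status check is simply requesting each URL and recording its final status code and any redirect hops, as in the sketch below. The URL list is a placeholder, and a full crawler such as Screaming Frog remains the more thorough option.

# Sketch: report status codes and redirect chains for a list of URLs.
import requests

URLS = ["https://www.example.com/", "https://www.example.com/old-page"]  # placeholders

for url in URLS:
    resp = requests.get(url, timeout=10, allow_redirects=True)
    hops = [r.status_code for r in resp.history]  # e.g. [301, 302] if the URL chains
    if resp.status_code >= 400 or hops:
        print(f"{url}: final {resp.status_code}, redirect hops {hops or 'none'}")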


Generally, you want clean URL structures that return 200 status codes. Run a crawl of your website using Screaming Frog to uncover 4xx and 5xx status codes. Use 301 redirects on broken pages to send users to a more relevant page. Implement custom 404 pages with available URLs to redirect traffic to relevant pages. Contact your web host provider about any 5xx errors impacting URLs. If technical SEO is the foundation of a website, then internal links are the doors that allow you to move from room to room. But as websites grow older and businesses change, maintaining consistency across your site and a solid interlinking structure can be difficult. Deep linking has served as an SEO best practice since the dawn of the internet.


Essentially, the idea is to link to orphaned pages on your site from a higher-level category page to pass authority from one page to the other and also ensure that the page gets indexed. Creating an organized interlinking structure around similar topics allows lower pages on your site to pull some authority from higher-authority pages. It also provides users with additional actions to take on your site, such as reading more about a particular sub-topic or traveling to another section of your site. Conduct a crawl to identify orphaned pages that are not being indexed. Use links strategically within content to pass along authority and provide additional reading content (a minimum of two per post).
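

One rough way to spot candidate orphaned pages without a full crawler is to compare the URLs in your sitemap against the internal links found on a handful of hub pages. The sketch below does a deliberately shallow version of this; the sitemap and hub URLs are placeholders, requests and beautifulsoup4 are assumed, and a proper crawl of the whole site is more reliable than checking a few hubs.

# Sketch: flag sitemap URLs that are not linked from a set of hub pages (possible orphans).
import requests
from urllib.parse import urljoin
from bs4 import BeautifulSoup
from xml.etree import ElementTree as ET

SITEMAP = "https://www.example.com/sitemap.xml"                        # placeholder
HUBS = ["https://www.example.com/", "https://www.example.com/blog/"]   # placeholder hub pages

ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
tree = ET.fromstring(requests.get(SITEMAP, timeout=10).content)
sitemap_urls = {loc.text.strip() for loc in tree.findall(".//sm:loc", ns)}

linked = set()
for hub in HUBS:
    soup = BeautifulSoup(requests.get(hub, timeout=10).text, "html.parser")
    linked |= {urljoin(hub, a["href"]) for a in soup.find_all("a", href=True)}

for url in sorted(sitemap_urls - linked):
    print("Possibly orphaned:", url)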


All websites are built around a topic hierarchy that is designed to communicate to users and search engines the purpose of each section of the site. Go to a site like Search Engine Journal and you’ll see how the top navigation is designed to create a topic tree under the umbrella of digital marketing. Tags are even implemented to organize content and help readers understand the context of certain topics. Generally, your hierarchy should be designed from a top-down approach, allowing search engines to crawl and index certain pages under buckets or clusters. Conduct user research to see what customers are searching for. Use exact-match keywords to optimize category pages and semantically related keywords for sub-category pages. Add breadcrumbs or links in footers for users to navigate back to a specific page (a structured-data sketch for breadcrumbs follows below). In the age of the mobile-first index, it’s absolutely crucial that your website is mobile-friendly.
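

On the breadcrumbs point referenced above, breadcrumbs can also be exposed to search engines as schema.org BreadcrumbList structured data. The sketch below builds that JSON-LD for a hypothetical category path (the names and URLs are placeholders); the output would be placed in a script tag of type application/ld+json on the page.

# Sketch: build BreadcrumbList JSON-LD for a hypothetical category path.
import json

crumbs = [  # placeholder hierarchy: home > category > sub-category
    ("Home", "https://www.example.com/"),
    ("Digital Marketing", "https://www.example.com/digital-marketing/"),
    ("SEO", "https://www.example.com/digital-marketing/seo/"),
]

breadcrumb_ld = {
    "@context": "https://schema.org",
    "@type": "BreadcrumbList",
    "itemListElement": [
        {"@type": "ListItem", "position": i, "name": name, "item": url}
        for i, (name, url) in enumerate(crumbs, start=1)
    ],
}

print(json.dumps(breadcrumb_ld, indent=2))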


The mobile-first index has become Google’s primary ranking index, meaning it is updated before its desktop index. When designing for a mobile user, it’s important to keep in mind the dimensions of the device itself, as well as different considerations for browsing on a mobile device. For example, long scrolls are preferable to links that force users to load another page and impede their browsing experience. But generally, the two most important mobile factors are mobile-friendly design and fast page speeds. Implement responsive web design. Tag pages with AMP code using your CMS. Improve page speeds by minifying on-site resources.


Jeffrey Cammack asked Google’s John Mueller whether it matters for SEO if someone uses American English versus Australian English. Mueller’s answer: his understanding is that it doesn’t play any role for SEO - maybe for users (and conversions), but not directly for SEO - and that you’re probably overthinking it. Still, you’d think Google would be able to pick up on the differences in the English and, just as it processes different languages to rank better in different localized versions of Google, might do the same here - so that a website written in Australian English would rank differently in Google Australia than it would in Google USA.


SEO is a discipline that requires strategy and specific tactics over time to be successful. While audits and one-time optimizations can have some impact on organic positions, it is hard to find sustainable rankings, organic traffic, and conversion goal growth without some level of ongoing commitment. At a basic level, we know that search engines - especially Google - are constantly changing and adjusting the signals and variables in their algorithms. Sometimes we know what is coming and can plan for it if announced. In most cases, however, we don’t know about any algorithmic changes beforehand. Whenever we see a change in position, visibility, or traffic - either due to a change made by a search engine or due to something your competitor has done - we’re forced to be reactive.


But you must never forget the importance of being proactive in SEO. What follows are some of the most crucial daily, monthly, quarterly, and yearly SEO tasks you must pay attention to as part of a solid ongoing maintenance or management plan. Staying up to date on industry news is a critical aspect of SEO that must be built into any maintenance or ongoing management plan. This ranges from the mission-critical alerts and updates the search engines announce themselves, to keeping tabs on SEO best practices and breaking news from sources like Search Engine Journal. Big shifts in the industry are hard to miss.


But smaller, more subtle changes can become magnified when you miss them or when best practices become outdated. Monitoring your key SEO performance metrics in real time, or at least once per day, is especially necessary for brands and companies that rely on ecommerce transactions or lead volume to feed a sales team. Knowing how your website is performing in search through top-level metrics is important for recognizing any red flags: a specific or aggregate positioning drop, an organic traffic drop, or a decrease in sales or lead volume. Being able to recognize problems as soon as they happen is key. You need to be able to diagnose issues and reverse any negative trends before they impact your overall marketing and business goals.
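

A simple way to operationalize that daily check is to compare yesterday’s organic sessions against a trailing average and flag large drops. The sketch below assumes a CSV export named organic_sessions.csv with date and sessions columns (both names are hypothetical), pandas installed, and an arbitrary 20 percent alert threshold you would tune to your own traffic patterns.

# Sketch: flag a day-over-baseline drop in organic sessions from a simple export.
import pandas as pd

df = pd.read_csv("organic_sessions.csv", parse_dates=["date"]).sort_values("date")
baseline = df["sessions"].iloc[-8:-1].mean()  # trailing 7-day average, excluding yesterday
latest = df["sessions"].iloc[-1]

drop = (baseline - latest) / baseline if baseline else 0
if drop > 0.20:  # arbitrary 20% alert threshold
    print(f"ALERT: organic sessions down {drop:.0%} vs. trailing average ({latest} vs. {baseline:.0f})")
else:
    print(f"OK: {latest} sessions vs. trailing average {baseline:.0f}")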


You can monitor less critical KPIs (any that don’t necessitate an immediate reaction) on a weekly basis. A solid SEO plan or campaign must have goals, a strategy, and specific tactics outlined. The daily process should include specific tasks, milestones, and achievable actions that work toward the bigger picture. The tactics can include things being done for the first time in a phased approach, or action items that follow more of a rinse-and-repeat methodology. Regardless, the list of specific technical, on-page, and off-page action items should be defined for the year, broken out into months, and further into tactics and progress that can be made on a daily basis to stay on track. SEO requires both big-picture thinking and the ability to tackle the daily tasks and action items.
