
SEO Problems and Solutions to improve optimisation within phpBB3

Posted by Highway of Life in Modifications on August 23rd, 2008

Many people think that SEO basically means implementing Human Readable URLs. But that is not all that “SEO” encompasses; it is one small part of SEO, and one that is generally grossly misunderstood. Many people believe that Human Readable URLs are the only way search engines can correctly index a site, or that they are the best method for spiders to index it, or that dynamic URLs somehow hurt your ranking or performance within search engines. This is not the case. Human Readable URLs mainly benefit search engines as added keywords within the page, which only reinforces the keywords already set by the page and topic title, keywords the search engines already use for their index.

Search engines have just as easy a time indexing dynamic URLs as they do Human Readable static URLs; the benefit is minimal.

SEO is Dead! Long Live SEO!
There are others who believe that, because the web has progressed so much over the years and search technology has advanced, search engines no longer require any optimisation to properly index your site, or that you cannot improve your search engine rankings or results by performing any sort of search engine optimisation at all.

This post is meant to address both sides of the argument by giving people a better understanding of what SEO is and its place within phpBB3. It also covers problems identified within phpBB3 itself with regard to search engine performance, and solutions to address those issues.


What is SEO?
SEO (Search Engine Optimisation) is any action or adjustment (an “optimisation”) that you perform on your site for the purpose of improving the volume and quality of traffic to your site or board from search engine results for targeted keywords. It can include one or many possible methods, from implementing Human Readable URLs and keyword targeting, to marketing, to disallowing a page or directory in your robots.txt file.

All of this is Search Engine Optimisation (SEO).
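To take the robots.txt piece as a concrete example, keeping a page or directory out of the index takes only a few lines. A minimal sketch, with illustrative entries rather than a recommended phpBB3 configuration:

Code: Select all
# Sketch: keep spiders out of areas that produce no useful search results.
User-agent: *
Disallow: /adm/
Disallow: /search.php
Disallow: /memberlist.php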

How does phpBB3 Handle SEO?
phpBB3, out of the box, has good Search Engine Optimisation (SEO) capabilities. It handles bot sessions appropriately, and it hides content that is useless to a spider, such as forms, links to profiles, and links that spiders should not or could not access (also known as “dead links”), among a few other things. Some of these measures simply improve the performance of spiders indexing the site: not displaying useless content cuts down on HTTP requests and bandwidth. But that is about the extent of it. There is much more that one can do to optimise phpBB3 for search engines, including many methods that I believe people are not yet aware of.

What is the main issue of SEO within phpBB3?
Within phpBB3, the main issue with optimisation is duplicate content. No, not the kind of duplicate content that will get you penalised or banned from Google (that is a whole other post), but the kind that distorts search results and costs slightly more bandwidth, because the spider indexes and re-indexes the exact same content as separate pages within its index. The search results for that single page then display as multiple results for the exact same page, which defeats the purpose of good search results and degrades their effectiveness.

How can this be improved? First, the problem must be completely understood. When a spider crawls your board, starting from the index page, it looks at all the links: there are links to the categories, forums, and subforums, and also to the last post of each particular forum. Upon entering a forum, it sees a list of links to topics, up to four page links within a single topic, and again the last post within each topic. Upon entering the topic, it will see a whole other copy of the page through the print option.

The idea is that we want the spider to index the topic in pages, and to do this, the spider has to see the exact same URL, dynamic or static, for each page of that topic. If it sees different URLs (such as those containing the p, or post, variable), it considers the page a completely new page and indexes it as such. But the last-post URLs on the index and category views, as well as the last-post link within a specific topic, show the spider far more pages on your forum than there really are, thus causing multiple indexing of the same content.

Secondly, users will post links directly to a post within a topic. Such a URL may contain parameters other than the forum id, topic id, and start variable, which are the only ones the spider should recognise. It may see multiple parameters, including: sort key, sort direction, order, session id, post id, print view, and highlighting.

Each new variable that differs from the last time the spider indexed this topic and page means a completely new page, and thus another indexing pass by the spider, further diluting the search results for this page's keywords and content while repeatedly consuming bandwidth over the same content.
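To illustrate, every one of the following hypothetical URLs (the ids and session id are made up) renders the same topic, yet a spider can count them as five distinct pages:

Code: Select all
viewtopic.php?f=2&t=1520
viewtopic.php?f=2&t=1520&start=0
viewtopic.php?f=2&t=1520&p=9876#p9876
viewtopic.php?f=2&t=1520&sk=t&sd=a
viewtopic.php?f=2&t=1520&sid=0123456789abcdef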

The Solution for Duplicate Content within phpBB3
Now that we understand the problem, what is the solution?
There are two methods that can be used to improve optimisation around the duplicate content issue within phpBB3.

First, by removing (hiding) the links to the last post within the topics and forums.
Second, by filtering out all parameters except for the topic_id and the start variable, for bots only (perhaps also enforcing the forum_id, but it has to be consistent one way or the other; remember that every added variable means a new page to the search engines, including Google). This type of change basically means redirecting the request on viewtopic, for spiders, whenever the parameters are not those strictly allowed. Such redirects can increase the number of HTTP requests if overused, but they may be necessary to improve the optimisation, and thanks to the decreased repeat indexing, the total may even itself out in the end. This change should only be performed for spiders and bots, as it would be a detriment to a user navigating your board to run into the same redirects.
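What follows is a minimal sketch of that second method, not drop-in code. It assumes phpBB 3.0.x, placed near the top of viewtopic.php after session setup; $user->data['is_bot'] is the flag phpBB3’s session code sets for recognised bots, though verify that against your version:

Code: Select all
// Sketch only: canonicalise viewtopic URLs for bots.
if ($user->data['is_bot'])
{
    // The only parameters a spider should ever see on this page.
    $allowed = array('f', 't', 'start');

    if (array_diff(array_keys($_GET), $allowed))
    {
        // Rebuild the URL from the allowed parameters only.
        // (A real version would first resolve a p= post id to its topic.)
        $clean = array();
        foreach ($allowed as $param)
        {
            if (isset($_GET[$param]))
            {
                $clean[] = $param . '=' . (int) $_GET[$param];
            }
        }

        // Send a permanent redirect so the spider drops the variant URL.
        // (Production code should use an absolute URL in the Location header.)
        header('HTTP/1.1 301 Moved Permanently');
        header('Location: viewtopic.php' . ($clean ? '?' . implode('&', $clean) : ''));
        exit;
    }
}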

These changes will be far more beneficial for search engines than any Human Readable URLs change. In fact, implementing Human Readable URLs can make this issue worse if bots are still allowed to index the dynamic URLs as well, since that adds yet more pages to what the search engines already see.

40 Responses to “SEO Problems and Solutions to improve optimisation within phpBB3”

Posted by Jwxie on August 23rd, 2008 at 6:20 pm:

Very powerful article, Highway of Life, and easy to understand. The middle part about how spiders work was a bit confusing for me, since this is the first time I have learned about spiders, but after reading the article a second time I fully understood it. Truly and sincerely appreciated.

Posted by Stokerpiller on August 24th, 2008 at 7:02 pm:

Hello Highway,
Nice article.

A couple of months ago I worked on a special SEO botstyle.
But I ran into a problem: cloaking.
Hiding content only from bots is called cloaking.
Cloaking can result in removal from the search engines.

So there is reason to be careful when hiding content from bots.

The solution to the cloaking problem would be using this:
[code]content google and guests doesn’t need to see[/code]
instead of this:
[code]content google doesn’t need to see[/code]
This is risk-free, but the downside is that it limits permissions for guests.

Posted by Mobile Ringtone on August 24th, 2008 at 8:49 pm:

Great article. However, another thing missing from phpBB3 is a Google sitemap and RSS. I know there are MODs available, but none are stable yet. I have always felt these two should have been included in the basic package.

Just a thought.

Thanks

Posted by evil3 on August 25th, 2008 at 2:31 pm:

Another problem with duplicate content is page titles. In phpBB3, you will have the same page title for all pages in viewtopic and viewforum. Daniel (Greenweaver) has come up with a great solution for this: the Page Title Number MOD.

Posted by Eric Martindale on August 25th, 2008 at 5:12 pm:

Great article, but I think phpBB3 dropped the ball with SEO.

Firstly, it’s a bad idea to handle bots differently from actual users. The whole point of SEO is that if you make content easily accessible to the actual user, search engines will be able to index your content much more freely.

phpBB3 has an almost endless number of duplicate URLs for the same content. This makes board administrators jump through hoops (using robots.txt and 301 redirects) to get things even remotely acceptable.

Oh, and are you kidding me? Google’s Matt Cutts has covered the topic of dynamic URLs vs. Static URLs previously. The fact remains, having an optimized URL (with your keywords in the filename) will vastly improve your visibility in search engines.

Also, where is our sitemap and native RSS support? How can a lightweight community project like WordPress support this out of the box, and a much more feature-inclusive project like phpBB3 neglect such things? (RSS support, mainly.)

Posted by Highway of Life on August 25th, 2008 at 5:47 pm:

Stokerpiller wrote:
But I ran into a problem: cloaking.
Hiding content only from bots is called cloaking.
Cloaking can result in removal from the search engines.

That depends on what you hide. If you hide posts or topics, then you would be guilty of cloaking.
However, if you hide graphics, images, and style (CSS/backgrounds, etc.), as well as forms and dead links (links that, if Google followed them, would lead to a dead-end page due to permissions), that is not considered cloaking and will not get you removed from Google Search.
It is for the same reason that you would use a robots.txt file to keep certain pages or directories from being indexed by Google.
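In phpBB3 template terms, hiding something from bots only is a one-line conditional. A sketch (S_IS_BOT is, if memory serves, the switch page_header exposes to the templates; verify against your version):

Code: Select all
<!-- IF not S_IS_BOT -->
    <!-- forms, profile links, and other content a spider has no use for -->
<!-- ENDIF -->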

It is also okay to hide content from Guests and only let Registered Users as well as Bots view the content. Many high-profile sites do this very thing.

Mobile Ringtone wrote:
Great article. However, another thing missing from phpBB3 is a Google sitemap and RSS. I know there are MODs available, but none are stable yet. I have always felt these two should have been included in the basic package.

There are many features that everyone believes should be added to the basic package. However, people forget that phpBB is meant to be discussion forum software, not a blog, a CMS, or some other such system.
A blog has to be searchable because it contains content. Many forums contain simply chatter, which search engines would view as mostly meaningless. However, there are boards out there that contain content valuable to searching users, and that is what we want to address when we optimise a site for the search engines.

Yes, RSS will be added, but it is not required for SEO. Same with a sitemap. It is not required.

@evil3, thanks for the link. Great MOD. 🙂

Eric Martindale wrote:
Firstly, it’s a bad idea to handle bots differently from actual users. The whole point of SEO is that if you make content easily accessible to the actual user, search engines will be able to index your content much more freely.

That is both true and untrue. 😉
It is not a bad idea to handle bots differently from actual users. First of all, it allows you to remove content that is useless to search/spider. It removes links that would be considered “dead links”, these are links that if Google followed, would lead to a dead-end page due to lack of permissions. Among many others.
But you are correct in that the point of SEO is to affect both the user and the search engines. Many people believe that the whole purpose is to only benefit the search engines. But this is untrue, it is meant to affect the user as well. What good are search results if they show the first 10 results to the exact same page?

Eric Martindale wrote:
phpBB3 has an almost endless number of duplicate URLs for the same content. This makes board administrators jump through hoops (using robots.txt and 301 redirects) to get things even remotely acceptable.

Now you’ve just proved why handling bots differently from users is necessary.
Those URLs are required for proper browsing by users: they need to be able to sort, order, show the print page, start on any page, view only a certain post, and so on. While this is required for the best user experience, it has the negative impact on search engines that you mentioned, because the search engines now view those pages as completely unique when they should not. That was the whole point of my blog post.

Eric Martindale wrote:
Also, where is our sitemap and native RSS support? How can a lightweight community project like WordPress support this out of the box, and a much more feature-inclusive project like phpBB3 neglect such things? (RSS support, mainly.)

You have to remember that blog software is used by bloggers to publish content that is beneficial to users and search engines, while bulletin board software generally contains mostly meaningless chatter that really should not be indexed at all.
Why is it missing? First of all, sitemap and RSS support is NOT REQUIRED for SEO. Most websites out there don’t even have them.
Sure, both can be used to improve your visibility within search engines, but they are far from required.

Posted by Eric Martindale on August 25th, 2008 at 8:08 pm:

Highway of Life wrote:
It is not a bad idea to handle bots differently from actual users. First of all, it allows you to remove content that is useless to search spiders, and it removes links that would be considered “dead links”: links that, if Google followed them, would lead to a dead-end page due to lack of permissions. Among many other things.

Now you’ve just proved why handling bots differently from users is necessary.
Those URLs are required for proper browsing by users: they need to be able to sort, order, show the print page, start on any page, view only a certain post, and so on. While this is required for the best user experience, it has the negative impact on search engines that you mentioned, because the search engines now view those pages as completely unique when they should not. That was the whole point of my blog post.

These parameters should not be stored in URLs (keep them in user preferences!), and if they are, the default phpBB3 robots.txt should contain the necessary safeguards to prevent penalization for duplicate content (especially with session IDs!).

Take a look at an example in one of my robots.txt files for phpBB3; I’m not even sure I’ve caught all of the duplicates. I probably wouldn’t have a leg to stand on if the phpBB team provided ample documentation on the correct way to configure this!

Highway of Life wrote:
You have to remember that blog software is used by bloggers to publish content that is beneficial to users and search engines, while bulletin board software generally contains mostly meaningless chatter that really should not be indexed at all.
Why is it missing? First of all, sitemap and RSS support is NOT REQUIRED for SEO. Most websites out there don’t even have them.
Sure, both can be used to improve your visibility within search engines, but they are far from required.

RSS support aside (not really an SEO topic), sitemaps are vastly important for SEO. Search engines don’t specifically penalize sites with dynamic URLs, but they do have a much more difficult time spidering all of the URLs on the site (especially with the massive volume of URLs generated by the duplicate content issues outlined above)!

The solution to that problem (which would not exist if phpBB3 were architected differently) would be to provide the search engines with a sitemap that prioritizes the content (raw topic URLs are more important than topic URLs with search or sorting parameters).
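To sketch what I mean (illustrative domain and priorities, following the sitemaps.org protocol):

Code: Select all
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- The raw topic URL gets the higher priority... -->
  <url>
    <loc>http://example.com/viewtopic.php?t=1520</loc>
    <priority>0.8</priority>
  </url>
  <!-- ...while later pages of the same topic rank below it. -->
  <url>
    <loc>http://example.com/viewtopic.php?t=1520&amp;start=15</loc>
    <priority>0.4</priority>
  </url>
</urlset>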

Highway of Life wrote:
Most websites out there don’t even have them.

Yes, and that’s why the people who do have it rise so far above these communities. SEO is good for you, and if you build the website with the user in mind, most of your SEO work will already be done: a good linking hierarchy, user-accessible sitemaps to explain the site, link text that makes sense, among a great many other things. Design for the user, and SEO isn’t a concern.

On the note of WordPress: I don’t see forums and blogs as being that different. The core of both is very much the same, in that there is a community and relationships that are built around the content. Both are an extension of the online experience, and I make a point of handling them both in a similar fashion. First came BBS systems, then forums, then blogs; it’s all the same progression of social media, and I think SEO has just as much a place in the forum space as it does on the bleeding edge of the blogging space.

Note: Also, it’d be nice to have a “preview” feature here on the blog, as I’m not currently sure if my quotes will work.

Posted by baldeagle on August 25th, 2008 at 8:42 pm:

Great article, but it seems we’re no closer to a real solution. I had thought permalinks would be a viable solution if you were able to ensure the link redirects worked properly. In the short run, unless these SEO issues are addressed, message boards will lack the power to sit high in competitive search areas. IMHO.

Posted by luchtzak on August 25th, 2008 at 9:02 pm:

This is text from my WordPress blog:

By default WordPress uses web URLs which have question marks and lots of numbers in them, however WordPress offers you the ability to create a custom URL structure for your permalinks and archives. This can improve the aesthetics, usability, and forward-compatibility of your links. A number of tags are available, and here are some examples to get you started.

You have default, with day and name, month and name, numeric and a custom structure of your wishes. You can add extra tags if you desire!

I already mentioned it in the forums: URL rewriting should be a standard feature of phpBB3, fully adjustable in the ACP.

RSS feeds are a highly demanded MOD; why not add them to standard phpBB3, with an option to disable them in the ACP?

Posted by Highway of Life on August 25th, 2008 at 9:16 pm:

I’ve fixed your quotes. BBCode is not allowed within the blog; you have to use HTML.
So use:

Code: Select all
<blockquote><strong>Highway of Life wrote:</strong>
text</blockquote>

Eric Martindale wrote:
These parameters should not be stored in URLs (keep them in user preferences!), and if they are, the default phpBB3 robots.txt should contain the necessary safeguards to prevent penalization for duplicate content (especially with session IDs!).

There is actually a good reason they have to be in URLs instead of user preferences.

There are parameters whose defaults are dictated by the user preferences, but you may need to sort or order a specific topic differently from all the rest.
Having these parameters in the URL gives the user total and complete control over how they would like to view the topic.
Taking these parameters out of the URLs and placing them in user preferences takes away the user’s control over viewing the topic, something we do not want to do, as it is a step backwards.
Highlight, start, post_id, sid, print, and others can never be dictated by user preference; it is simply impossible. The SID, for example, cannot be handled outside of the URL if it doesn’t make it into the cookies.
phpBB, and all bulletin board software for that matter, is designed specifically for the user. Doing it the way we are doing it gives the user the control they need to adequately browse the forum. If you remove that control, you end up frustrating the user.

Here is a perfect example of something that was designed for the user but is not that great for SEO. That cannot be helped, and that is why this discussion exists.
I think this discussion is a good example of how the proposed ideas, while they would improve SEO, would dramatically degrade the user experience, which is contrary to what you want to achieve.
So now you have to ask yourself: what is more important to me, the user experience, or better SEO handling?

Posted by Daniel on August 25th, 2008 at 11:18 pm:

I was searching for a method to prevent those duplicate indexings by Google, and I found a set of robots.txt rules here: http://techtitbits.com/2008/07/prevent-duplicate-indexing-of-phpbb3-threads-by-google/

Seems like it’s working well. 🙂

Posted by Anderson on August 26th, 2008 at 2:21 pm:

Googlebot-Image cannot find images posted with the attachment MOD of phpBB3. If you doubt me, show me one phpBB3 forum whose attachment-MOD images turn up in this image search:

http://images.google.com/imghp?gbv=2&hl=pt-BR&btnG=Pesquisar+imagens

Posted by Jaap on August 26th, 2008 at 4:18 pm:

@Eric Martindale

I fully agree with you.

Sitemaps do work!
WordPress has exactly what phpBB3 is missing.

Do MODs have to do the job again?

I really think the WordPress team thinks more about what webmasters want.
The phpBB3 team made an excellent product to use, but forgot that without visitors, forums are boring.

Posted by Highway of Life on August 26th, 2008 at 9:13 pm:

@baldeagle and luchtzak, thanks for your feedback.

The problem is the same as it has always been: many people involved in SEO disagree on the exact implementation of SEO, or on whether it is even necessary at all. That is why I wrote the post; I think it is an issue that can and should be looked at seriously. But at the same time, many current SEO methods conflict with usability within forum software, and every bulletin board software suffers from this effect. The two were not meant to be perfectly married to each other. Sure, you can try, and you can make it work, but not easily. It is not as simple as just creating RSS feeds.

phpBB3 took the route of maximum usability rather than maximum search engine optimisation. The goal is, and was, to make the forums easier for users to use.
Search engines are able to handle dynamic websites just fine these days, as is certainly proven by phpBB.com coming up so high in specific keyword searches. But is it perfect? No, and neither is blog software such as WordPress.

However, blog software is very different in core functionality from a forum.
A blog is simply a single page per post, with all comments posted on that page. In reality, a blog is not dynamic in the sense that you cannot change views, look at it with different ordering or sorting, or follow a thread of discussion.

That certainly doesn’t mean we don’t need to improve. We certainly do, but we also need to look at it from a usability perspective, something that is often forgotten during the optimisation process.

@luchtzak, there are many highly demanded MODs, or at least claims of highly demanded MODs. But the development team has to decide what is best for the software while trying to meet everyone’s needs. We can’t just put every MOD under the sun into phpBB3 because every other person believes their request is highly demanded.

That said, it has already been noted that RSS feeds will be a default feature in phpBB 3.2. But we have a very large MOD community precisely because phpBB is meant to be expanded, and you can expand 3.0.x with an RSS MOD; there are plenty of them around. The same goes for a sitemap MOD.

@jaap, thanks for your feedback. Indeed, sitemaps do work; that has never been debated. But to say that a sitemap is required for SEO is incorrect, just as saying that static URLs are required for SEO is incorrect.
Yes, both methods help SEO, there is no doubt about that, but there are many things a webmaster can do to improve SEO, and not every single method is required for every site and every administrator.
As I said before, bulletin boards are generally just used for discussion, and people don’t want to wade through discussions when they are looking for information on something. Of course, there are many exceptions; it is not a hard rule.

Secondly, do not compare blog software with bulletin board software; they are inherently different, just as CMS software cannot be compared to blog or bulletin board software. There are many intricacies and differences between these types of software that people do not appreciate.

Posted by Anderson on August 27th, 2008 at 1:23 am:

Here in Brazil I have already seen many people say they do not use phpBB3 because it is bad for SEO. Googlebot does not stop at every link; now imagine all the bots having to work that much harder because of duplicate links.

Posted by Anderson on August 27th, 2008 at 1:52 am:

Highway of Life, Googlebot-Image cannot find the images in this topic on your own forum:
http://startrekguide.com/community/viewtopic.php?f=14&t=6000

Posted by Emiel van Rossum on August 27th, 2008 at 2:46 pm:

That’s a coincidence; I also just wrote a blog post about forum SEO in general, five tips for forum SEO, especially targeting phpBB (in Dutch): http://www.emielvanrossum.nl/vijf-tips-voor-forum-seo/

You are correct that duplicate content is the main problem, but you are forgetting bad titles.
“phpBB • View topic – the whole topic title” is too much irrelevant info.

Posted by Anderson on August 28th, 2008 at 1:47 pm:

I already know where the duplicate links error is: it is the f= parameter of viewtopic.php:

viewtopic.php?f=135&t=1520
viewtopic.php?f=4&t=1520
viewtopic.php?f=87&t=1520

Remove f=:

viewtopic.php?t=1520

and there is no duplicate link.

Posted by Eric Martindale on August 29th, 2008 at 7:12 pm:

Highway of Life wrote:
Secondly, do not compare blog software with bulletin board software; they are inherently different, just as CMS software cannot be compared to blog or bulletin board software. There are many intricacies and differences between these types of software that people do not appreciate.

No, there is no reason we must separate the two types of software. As I said, the core focus of both a blog and a forum is to provide a platform where a community can share ideas and interact with one another. A blog is essentially a platform for moderated discussion, where topics can only be created by specific people (the authors).

Look at this blog post, for example – we have an original topic, and many replies that carry on a conversation between the original poster and the audience. I think this is more than enough of an example of my point.

I feel that many improvements were missed in phpBB3 because of the perspective that forums are not part of the new social media revolution that’s been going on for the past few years. As I’ve covered in previous blog posts, forums are social, but we need a few things to really get forum platforms up-to-date.

I don’t have a reference post, but I read that phpBB doesn’t include a large list of features because it aims to be a base platform that can be expanded by MODs. Whether or not I agree with that I won’t say, but things like accessibility should be considered. Seriously, how do you expect a user to remember “http://mydomain.com/viewtopic.php?t=1337” as a URL? Or how about profile URLs:

http://mydomain.com/memberlist.php?mode=viewprofile&u=153860

That’s impossible for the user to remember! Why not:

http://mydomain.com/user/Membername ?

That is an example of keeping the user in mind. It’s little things like that. Sure, we can store preferences in the URL to override a user setting, but make it so the board doesn’t get penalized by search engines for those crazy URLs. Besides, having that sort of URL structure is good for SEO anyway!
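For what it’s worth, a rule along these lines could map such a URL back onto phpBB3. This is a sketch for Apache’s mod_rewrite, and the un= view-profile-by-username parameter for memberlist.php is an assumption to verify:

Code: Select all
# Sketch (.htaccess): map /user/Membername onto phpBB3's profile page.
RewriteEngine On
# 'un' as the view-profile-by-username parameter is an assumption; verify it.
RewriteRule ^user/([^/]+)/?$ memberlist.php?mode=viewprofile&un=$1 [L,QSA]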

Posted by Eric Martindale on August 29th, 2008 at 7:27 pm:

Also, since there isn’t an edit function:

Take a look at what the people behind WordPress are doing: they’re building bulletin board software that is actually lightweight, with features added by plugins: bbPress.

Did phpBB lose track of its original goals? It seems like they straddle the fence when it comes to the MOD community and the features that are included in the base software.

Go take a glance at any of the new communities out there (Last.fm, for example) and you’ll see that because they’ve designed the whole site around the user experience, they’ve already got a large portion of their SEO work done.

Just some food for thought.

Posted by grantgrant on August 30th, 2008 at 10:38 pm:

Eric: I enjoyed reading your posts and largely agree with your points. While I am still debating using phpBB3 or bbPress or vBulletin (more support) or even Lefora (by the way, they are good, but I want to host my own forum with the ability to fully customize it, including ads in the future, etc.), I noticed you have done a great job on the URL structure for http://www.roleplaygateway.com/forum.html; how can I customize mine just like that?

I am not a programmer but am familiar with SEO. I would love to hear your take on using bbPress, phpBB3, or vBulletin (I tried Vanilla a bit, but its support is just not there, and the community is less robust). My key interest is being able to customize the CSS (through hiring on Elance or whatnot) and to add social plugins on a regular basis to the forum community I will be starting.

Thanks!

Posted by Anderson on September 1st, 2008 at 11:02 am:

Do a test: create a forum and make a topic in it, then create another forum and move the topic to the new forum. The URL of the topic changes when you move the topic.

Example:

viewtopic.php?f=135&t=1520
viewtopic.php?f=4&t=1520
viewtopic.php?f=87&t=1520

Posted by jacob on September 1st, 2008 at 11:58 am:

As a new forum owner it’s very disheartening to read something like:

Highway of Life wrote:
A blog has to be searchable because it contains content. Many forums contain simply chatter, which search engines would view as mostly meaningless.

Wtf? Really? Damn, I should have returned my vBulletin license. The least phpBB could do is take a lesson or two. At least stride in the proper direction, even if the strides are small.

Posted by jacob on September 1st, 2008 at 12:01 pm:

As an added note, I’m curious why this install of WordPress was configured to use mod_rewrite and friendly URLs instead of the default: http://www.site.com/?p=123.

Guess friendly URLs are desired… by all but phpBB owners? Pfft.

Posted by dcz on September 2nd, 2008 at 8:11 am:

Great post again, Highway of Life. I just wanted to mention our latest thoughts about this: http://www.phpbb.com/community/viewtopic.php?f=64&t=1134585

😉

Posted by Anderson on September 3rd, 2008 at 8:19 am:

I opened a discussion about phpBB3 and its images at:

http://www.phpbb.com/community/viewtopic.php?f=64&t=1169105

Posted by Eric Martindale on September 4th, 2008 at 4:59 pm:

@grantgrant: There are some serious overhauls of phpBB’s code that need to be done to achieve this structure. A good place to start is the phpBB SEO package, which gives you a good starting point for the mandatory tweaks (page titles, URL structure).

You bring up another point about SEO that I think phpBB3 missed: social media optimization. If you build a great website that makes it easy for a community member to share great content, you get a lot of your SEO work taken care of automatically: if members are able to freely share content, you are building organic links and relationships, which are the most valuable portion of the link building process.

Feel free to contact me (see my website for information) if you need some help. I’ll be more than willing to lend you a hand.

Posted by mrGTB on September 4th, 2008 at 6:27 pm:

I do agree that dynamic URLs get indexed the same way readable URLs do. But to be honest, you’re missing the point: readable URLs are still better for SEO, because they contain important keywords which help a lot more. For example, if I create a page about discus fish and give it a readable URL called “discus_fish.html”, that is much better for SEO than a dynamic URL that might say “thread_273645.php”, for obvious reasons: there is nothing in the latter that a search engine can use to reinforce the content on the page.

Posted by grantgrant on September 10th, 2008 at 10:05 pm:

@Eric Martindale: Thanks Eric – I am still setting up my fitness forum but you are welcome to visit.

@Anderson: I also want to acknowledge the excellent work the folks at phpBB have done. I know it takes a lot of time and energy to get it right, but you are on the way.

Posted by Anderson on September 14th, 2008 at 1:51 pm:

phpBB3 SEO: link move error in Google

link -> http://www.phpbb.com/community/viewtopic.php?f=64&t=1170945

Posted by Computerontopic on December 3rd, 2008 at 8:25 pm:

I read that Google made over 400 changes to their algorithm in 2007. How does one even manage to keep up with SEO these days?

Posted by james on December 19th, 2008 at 10:15 am:

Managing SEO rules and algorithms is a very tough task; search engines, Google above all, change their algorithms quite frequently these days, but we have to stay up to date if we are to survive in the top ranks!

Posted by Webhead on December 25th, 2008 at 4:44 pm:

Very insightful article. Although SEO is important, I think too much SEO can have negative results as well. A site that appears to be perfectly optimized may be targeted by the search engines and penalized just the same as a site that uses old techniques like keyword stuffing. Today’s search engines seem to focus on natural rankings and thus attempt to rank pages by human usage, not bot usage. With proper link building and well-written, unique articles, a site will rank very well with simple white-hat techniques.

Posted by Linas Van on January 5th, 2009 at 8:48 pm:

Thank you, it was good to read about HTTP requests.

Posted by Halil Özgür on September 4th, 2009 at 9:37 am:

Highway of Life wrote:
Many people think that SEO basically means implementing Human Readable URLs. But that is not all that “SEO” encompasses; it is one small part of SEO, and one that is generally grossly misunderstood.

Not that small; you are underestimating it. As is implied by your other remarks, you seem to underestimate or misunderstand the whole topic of SEO.

Highway of Life wrote:
You have to remember that blog software is used by bloggers to publish content that is beneficial to users and search engines, while bulletin board software generally contains mostly meaningless chatter that really should not be indexed at all.

Now you:

Have (tried to) reduce the whole idea of forums to “meaningless chatter”. But rest assured that forums are nowhere near being reduced by you, me, or a thousand of us.
Seem to be completely unaware of the fact that many of the most visited (and thus money-making) sites are just forums. Go ahead and take a look at Alexa, Google Trends, and similar tools. Even a few Google searches that return forum results first would suffice. We have many examples here in Turkey, for example.

Sorry, but I’m unable to thank you for such irrelevant and misleading ramblings that make me want to move away from phpBB.

Posted by search engine optimization seo on September 10th, 2009 at 2:16 pm:

phpBB3 SEO looks interesting. I guess this would help in improving traffic.

Posted by Black Hawk on September 20th, 2009 at 3:59 pm:

At least this has been taken into account in developing and using phpBB. And I think there is no way to be 100% SEO perfect; a good forum with good topics is the key to attracting more users.

Posted by Skinny Vinny on June 28th, 2010 at 4:24 pm:

Every time someone brings up SEO for phpBB, namely the furls (friendly URLs), someone points to this post. I’ve seen it enough that I have to reply, lol.

Briefly, to the “SEO is dead” remarks: SEO evolves. Google wants you to focus on the user, not the SERPs, but there will always be white-hat ways to optimize pages. The strategies of 10 years ago don’t work today, and today’s strategies will be outdated 10 years from now. SEO is not a one-time effort: it’s ongoing. What’s beneficial one day can even become detrimental the next (page sculpting, for example).

The typical reply around here has been that “Google can and will index your pages with the dynamic URLs”. This is true; however, SEO is not about simply being indexed: it’s about improving the page’s position in the SERPs!

Using phpbb.com itself as an example is flawed: it’s a popular site that bots crawl frequently. Smaller and newer boards cannot expect the same activity.

There is absolutely a benefit to using furls! The gain is that when a topic’s furl is posted to other forums, blogs, etc. and made into a link (like phpBB’s automatic/magic URLs through make_clickable), you’ve made the page’s furl, ideally containing the topic title (properly crafted with targeted keywords), a textual part of that link, which is a huge step for SEO; arguably the greatest automatic off-page SEO one can do. Yes, Google also likes to look at the surrounding text, but the link itself is still a large part of Google’s 200-plus ranking factors. So yes, great content is the key to naturally gaining these inbound links; the furl itself is of little value without links to the content.

The opinion that bulletin boards are typically meaningless chatter is worthless. Many topics are spot on for a specific subject, and it’s these pages that get linked to and whose SERP positions we seek to improve. Forums often appear in results for more specific searches (referred to as long tails): SEO for a board is about rising to the top for these types of searches. A furl, due to the links pointing at the page, can mean the difference between page 1 or 2 and page 20 or 30 for these long-tail searches. Also, Google doesn’t care if it’s a board, a blog, or some CMS. A page is a page is a page.

Duplicate content is a major concern. Nobody is arguing that, but resolving it is not any kind of final solution. You can make use of the canonical link tag, but it merely helps and cannot be relied upon. You must establish a standard URL for each page, check the request_uri, and 301 to that page’s standard URL.
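For reference, the canonical hint is a single line in the document head (the domain and topic id here are placeholders):

Code: Select all
<link rel="canonical" href="http://www.example.com/viewtopic.php?t=1520" />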

It’s not my intent to advise on how to handle any individual board’s SEO here, but this common (almost anti-SEO) reply we see over and over again is poor. I agree that there’s no one-size-fits-all solution for phpBB optimization, but the standard “furls are a waste of time” opinion and reply is completely wrong.

Instead of arguing about whether there’s a benefit to a furl: can anyone suggest what harm there is in it (assuming 301s are in place to address duplicate content)?

Final remark: is it amusing to anyone else that this blog, like most blogs, uses furls?

Posted by Jam on August 5th, 2010 at 7:03 am:

Dear phpBB,
I know that almost every CMS does friendly URLs, so why doesn’t phpBB care about them?

Posted by Wadie on August 8th, 2010 at 8:14 am:

Brilliant article!
Thanks David (Highway of Life) 😉

Commenting is disabled for this blog post