The SEO thing is not as bad as it seems.
Fair do's though: the stuff I am writing about I have yet to implement on my own Dolphin site. ;) Coming over from WordPress, a lot of work has gone into making WP fairly good with on-page SEO over the past few years. To be fair, because it is such a popular platform the SE's know how WP works inside and out, to the point that the GoogleGods have said it ain't worth spending time shaping PageRank (doing this kind of stuff) because they already know how your site works. I wish the same were true of Dolphin. ;)
The biggest single issue I see WRT Dolphin is duplicate content.
My site is nothing too special but many of the pages score 96% on the Firefox plugin SEO Doctor. So, most everything is fine. My biggest issue with the on-page stuff is the H1 and H2 tagging. If you are not using SEO Doctor, get it and use it - it'll give you info that'd cost you a shedload of money if you paid an expert to look at your site. ;)
Meta tags are an issue if you are doing timewarp SEO using "SEO for Dummies" published in 1998. Nowadays, if it is easy, do it. If it ain't, don't bother. Much more important is that your content pages enable SE's to retrieve a decent snippet to go in the search results. Dolphin is OK in this regard.
The biggy is, as noted, duplicate content, and in truth it is only an issue if your site is not well crawled by bots and you are seeing loads of pages of the same stuff in the SERPs. However, this is something we can deal with.
I had thought that mucho coding would be needed, but no, not a bit of it. ;)
There are several routes to work on, all of which serve to tell SE's which pages to index.
Here they are. I am not going to go into great detail here; there are books on each of these topics and Google is your friend.
Alterations to templates - just add 'nofollow' tags to templated links that point to places you don't want SE's to go. For example, some folks don't want SE's indexing category or tag pages. You may want to block off member profile pages. That'll kill a load of duplicate content quite well.
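For instance, a templated profile link with nofollow added would look something like this (the URL and link text are just placeholders, not actual Dolphin template code):

<!-- rel="nofollow" tells SE's not to follow this link or pass weight through it -->
<a href="http://example.com/profile/SomeMember" rel="nofollow">SomeMember</a>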
Next, look toward your .htaccess file - you can use it to block off access to whole parts of your site and kill scads of DC. Easy targets are the www/non-www split and URL suffixes such as a trailing / or .html.
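As a rough sketch of the www and trailing-slash clean-up (Apache mod_rewrite, with example.com standing in for your own domain - test on a copy of your .htaccess first):

RewriteEngine On

# send www.example.com to example.com with a 301 so only one version gets indexed
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^(.*)$ http://example.com/$1 [R=301,L]

# drop a trailing slash on anything that is not a real directory
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.+)/$ /$1 [R=301,L]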
You can also block off chunks of the site where two types of URL lead to the same place. Set .htaccess to serve a NOINDEX header for all the variants that you do not want. If you want to tidy up what SE's are already indexing, then use 301 redirects to tell them 'not here - but here'.
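Something along these lines - the URL patterns are made up for illustration, so treat it as a sketch of the idea rather than drop-in rules (needs mod_rewrite and mod_headers):

RewriteEngine On

# mark any URL carrying a ?sort= parameter, then send a noindex header for it
RewriteCond %{QUERY_STRING} (^|&)sort= [NC]
RewriteRule .* - [E=NOINDEX_PAGE:1]
Header set X-Robots-Tag "noindex, follow" env=NOINDEX_PAGE

# 301 an old duplicate URL to the one you do want indexed: 'not here - but here'
Redirect 301 /old-duplicate-page /the-page-you-want-indexed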
Finally, Google's Parameter tool lets you do much the same as .htaccess, and more, in a manner that lets you take things step by step: http://googlewebmastercentral.blogspot.com/2009/10/new-parameter-handling-tool-helps-with.html With this tool you tell Google how your site is built and where to go, and not go.
Oh, and finally finally, a good sitemap: make a sitemap that only lists the pages you want indexing and, guess what, those will be the pages that get indexed. ;)
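A bare-bones example of the format, with placeholder URLs - list only the pages you actually want in the index and leave the duplicates out:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://example.com/</loc>
  </url>
  <url>
    <loc>http://example.com/blogs/a-page-worth-indexing</loc>
  </url>
</urlset>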
Basically, you can slay all the duplicate content dragons without going into the Dolphin code. If you want, don't bother with the template stuff. You can do much the same using the other methods.