ScrapeBrokers SEO Pack Guide
- How to use our packages -
Dear customer,
I would like to take this opportunity to once again thank you for your purchase. This package will help you get great results quickly and reliably.
Every day I read a lot of misguided information about configurations for Scrapebox. I wrote this guide to break some common misconceptions about Scrapebox and to help you optimize your workflow and reach its greatest potential. I have done my best to write this guide with as little technical jargon as possible. However, this is not a guide to be used lightly.
If you are new to Scrapebox, I highly recommend reading this How-To guide from start to finish before starting your very first link building campaign. Enjoy!
(c) ScrapeBrokers 2011 – 2013 http://www.scrapebrokers.com
Table of Contents
1. SEO Pack Overview
2. Configuring your Computer and Network
3. Configure ScrapeBox Settings & Proxies
3.1 ScrapeBox Settings Explained
3.2 Proxies
4. Your 1st ScrapeBox Project
4.1 Creating Project Directory
4.2 Create Websites File
4.3 Create Comments File
4.4 Start Posting
4.5 Posting Trackbacks
4.6 Link Checking
5. Powerful Backlinking Strategy Using ScrapeBox Auto Approve Lists
6. Tips for Working with Lists
7. Tips for Choosing Anchor Texts
8. Avoid Being Flagged by Google Panda, Penguin or the Sandbox
9. Troubleshooting and Help
10. Final Words
Appendix 1 – Tweaking the Operating System
1. SEO Pack Overview
In this package you will find a series of folders and files ready for use with Scrapebox. I have created a directory structure to keep your link building campaign highly organized and to prevent mistakes and frustration.
An overview of the product you just purchased:
1. AutoApprove_MainList.txt
The latest Auto Approve List, containing all auto approve blogs. Blast to this list to get the best link exposure. Do not blast your money/main site directly; blast the other sites you created (sites that link to your money site).
2. AutoApprove_EDU_List.txt
The latest list of .edu and .gov auto approve blogs. This list can be used to blast new websites and money sites, but do it slowly and spread the blast across multiple URLs. Also make sure you select well-matching anchor texts.
3. AutoApprove_DoFollow_List.txt
This list contains only DoFollow auto approve blogs. Use this list to increase the PR of your sites. You should remove duplicate domains and then blast your money site.
4. AutoApprove_DoFollowHighPR.txt
A premium list of all the dofollow blog pages that have a PR higher than 1. If you blast to this list the PR of your sites will definitely increase.
5. AutoApprove_HighPRList.txt
All blog pages with a PR higher than 1 are collected in this list. If you prefer to send your links only to high PR sites, use this list.
6. AutoApprove_Trackbacks.txt
Trackbacks are a great and easy way to obtain backlinks from a site and are not abused the way comments are. This list contains blogs that accept trackbacks.
7. AutoApprove_TrackbacksHighPR.txt
This list contains only blogs that accept trackbacks and have a PR higher than 1.
8. Manually_DoFollowHighPRLowOBL.txt
Premium list with high pr blogs that you need to comment manually to obtain a backlink. This
Page 3
list have very low obl and are powerful. PR Up to 7.
9. CommentsByNiche
Inside you will find a series of pre-spun comments for over 20 different niches. If your niche is not there, use the Ultimate Comment Scrapper program to grab yourself a fresh list of comments.
10.Goodies
This folder gathers some files you can use to enrich your SEO campaigns: generic comments, generic keywords, rapid indexer lists, bad words lists, and two freshly generated lists of 100,000 emails and 100,000 names.
11. SEO LinkTools
We bundled several SEO tools you should use to enhance your productivity. To name a few: SB DoFollow Checker, Ultimate Comment Scrapper, Sick Link Checker, etc. These tools are free to use for all our customers.
12.SEO Lists
Along with auto approve lists we offer SEO lists as well: article directories, bookmarking sites, video sharing, press release and RSS sites, etc. There are over 700 working wiki sites, over 200 article directories, and more.
13. SlowPoster Lists
Slow Poster Lists contain auto approve blogs that protect posting with a captcha code. If you enable the slow poster and add a decaptcha service in your Scrapebox, you can post on these blogs as well. If you use a program like CaptchaSniper you can post faster and, best of all, for free.
14. ScrapeBrokers Guide
This guide.
2. Configuring your Computer and Network
There are a few steps to complete in setting up your computer and network before we even touch Scrapebox.
2.1 Preparing your network connection
If you are using Scrapebox from your home PC, or even a high speed VPS/dedicated server, there are limitations in place that prevent Scrapebox from reaching its maximum potential.
When Scrapebox begins posting your new Auto Approve list, it instantly opens multiple connections from your network to the internet. The number of connections it opens can range from 1 to 500, defined in the SB Maximum Connections settings. If you were to set your SB installation to attempt 500 connections on an unoptimized network, you would almost immediately crush your own network.
On top of the hardware limitations, you have limitations in your own operating system. By default, Windows Vista, 7 and Server 2008 disable some resources that we can use to improve how the network connection handles HTTP requests, packet processing, memory and CPU management, compression and various other goodies. Out of the box, the OS is tuned to operate optimally for normal network applications.
If you want to do SEO and still use your computer for other activities at a decent speed, without waiting for every SEO campaign to finish, then I want to tell you about Virtual Private Servers (VPS).
2.2 Several VPS Advantages
• You can run the tools 24/7 without having to keep your desktop PC on.
• You do not have to pay hundreds or even thousands of dollars upfront to buy a super computer for your marketing needs. A decent computer at home plus $50 – $100 monthly rent for a VPS can do the job.
• If there are resources you do not use, you can simply share accounts on the VPS with other marketers and split the costs.
• High Internet speed is also a great advantage of Virtual Private Servers (VPS), as hosting companies are connected to the internet through high speed pipes.
• You don't have to buy licenses for each of your staff around the globe. Simply create an account on the VPS for each worker and they will be able to connect in a few seconds with the tools handy. You can also easily monitor their work.
• Your main computer or laptop stays free. While all your internet marketing software runs on the VPS, your main computer can be used for daily activities.
• If something goes wrong you can simply restore the VPS to its initial state or a previous backup with a few mouse clicks. Using a file sharing service like DropBox will keep your files safe and, best of all, it's FREE.
Using Scrapebox on a dedicated machine is highly recommended. While SB does not actually consume a significant amount of bandwidth in the short term, it will generate thousands of gigabytes of data over extended use. The main drawback is the number of connections that are constantly active, closing and reopening. Luckily, Scrapebox is very light on resource usage and can be run from a basic machine with at least 1GB of memory and a 2.26GHz processor.
Another recommendation is to keep your dedicated Scrapebox machine on a separate router/switch from the one you share with your entire network. Due to the nature of the connections Scrapebox makes, it easily loads router processing to very high levels. Reducing the number of connections on a single router/switch is recommended to ensure high performance and to allow for easily resetting the router should an overload occur.
At the end of this guide you will find an advanced tutorial on how to tweak the networking settings of any Windows operating system, which can boost your success rate and speed by 300-400%.
3. Configure ScrapeBox Settings & Proxies
If you are a regular Scrapebox user you know that the success of your link harvesting and comment posting depends a lot on the Scrapebox connection settings and the quality of your proxies. Set them HIGH and you get a lot of failed posts but the job is done fast; set them LOW and you get better results but the job takes forever to complete.
I will try to explain everything you should know about these settings and how you can tweak ScrapeBox to work best with your configuration.
3.1 ScrapeBox Settings Explained
The typical ScrapeBox connection settings you should use:
• Fast Poster: less than 50 connections (20 or less is ideal), 90 sec timeouts
• Link Checker: less than 50 connections, 90 sec timeouts
I use these ScrapeBox connection settings on my 100 Mbit DEDICATED server:
• Link Checker – 100 connections, 90 sec timeout
• Fast Poster – 200 connections, 50 sec timeout
• Proxy Harvester – 100 connections, 15 sec timeout
• URL Harvester – 75 per engine; I only run 1 engine per Scrapebox instance.
To get to the scrapebox connection settings, Timeouts, and Randomize go to the top menus in Scrapebox:
Settings -> Adjust Maximum Connections
Settings -> Adjust Timeout Settings
Settings -> Randomize Comment Post Blogs List
Maximum Connections
For Auto Approve lists we will be using the Fast Poster option. We will need to set the number of connections to prevent issues while posting. If you have followed the optional but highly recommended step of configuring your network for Scrapebox, you can set this option higher.
• 10 private proxies (with computer configuration) on a 30MBps/5MBps connection: you can set this at 50 connections.
• 10 private proxies (without computer configuration) on a 30MBps/5MBps connection: set this no higher than 20.
To make this change click "Settings > Maximum Connections" and adjust the slider bar under Fast Poster to a range that is compatible with your system configuration and network speed. This may require some testing over time to find your optimum setting.
Timeout Settings
This setting tells Scrapebox how long it should wait before giving up its attempt at resolving the website host. We need to set this option at the maximum for two reasons:
• Allow more time to resolve the web host and increase the success rate.
• Reduce the amount of load on your network.
To make this change click "Settings > Timeout Settings" and move the slider bar under Fast Commenter all the way to the right (90 seconds). Click "Apply".
Slow and Manual Poster Blog Links
Update your "Slow and Manual Poster Blog Links" setting, found under the Settings menu. Max it out to 4096 KB. It is not documented, but the Fast Poster also uses this setting. Note that blogs that use videos, complicated plugins or rich media content, or that have a lot of comments, can easily exceed 2-3MB, so the 4MB maximum may be needed. As a test, just go to your favorite blog and save the webpage; select the HTML file and the content folder (images, js) and see how much space they use. You will be surprised, and you will understand that webpages are heavy and consume a lot of bandwidth.
Absolutely make sure you choose "Randomize comment poster blogs list" under Settings. Not randomizing your comment poster blogs list will cause you to post to all the URLs from the same domain all at once, or in rapid succession. This can look like a DDOS attack to the web server of that domain, at which point it will shut you down and your comments won't go through.
3.2 Proxies
First, I must stress the importance of using Private Proxies while using Scrapebox. While public
proxies can do the job, you will not see maximum performance.
Proxies are very important for a good posting success rate so you should either be using 10 or more
private proxies or be using 10 or more public proxies that have a low latency and pass the IP test in
scrapebox. Private proxies will have a significantly higher success rate.
If you are not using Private Proxies, you are going to find lower success rates using this package. It's
essential you spend the extra money on a good set of 10 Private Proxies.
Here are two private proxy providers at a reasonable price: SquidProxies.com and Proxy-Hub.com.
How to use free shared proxies with ScrapeBox
If you must use public proxies, there are unfortunately extra steps for gathering "acceptable" proxies for use with Scrapebox. This is a simple but time-consuming process of finding proxies that we can use for posting.
First we will want to turn on the new proxy harvester within Scrapebox. To enable this click "Options
> Check Use New Proxy Harvester"
Now in the bottom left corner of Scrapebox you will find the proxy section. Let's go ahead and click
"Manage" to open the proxy harvesting application. Since we are using these proxies for posting and
not harvesting we do not need to check if they are blocked by Google. You will want to check the box
at the bottom that says "Skip Google".
Now you may add your own proxy sources or use the default sources supplied by Scrapebox. For now we will use the default sources, which are already enabled. Go ahead and click "Start" and wait 1-3 hours for this to finish. After your extended wait, we need to filter out the bad/high-latency proxies and rerun the test. Click "Clean Up" and choose the option "Filter Proxies". A small window will appear allowing you to adjust the latency threshold. Move the slider all the way to the right (10,000ms) and click apply.
With your new clean list, re-run the test. After your second run, repeat the same clean-up process, but this time use 5,000ms as the limit. Running a third and final test and removing everything above 2,500ms is highly recommended to ensure a decent success rate while posting.
After you end up with a list of working proxies, load it again, retest it, remove the failed ones, and repeat this step 4-5 times until no more proxies are shown as failed. This way you end up with good working proxies; not as good as private ones, but good enough.
Finally, export your new proxy list to a file and then transfer the proxies to the main window.
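For reference, here is a minimal Python sketch of that filter-and-retest loop done outside of Scrapebox, assuming you exported the harvested proxies as plain ip:port lines; the file names and test URL are placeholders, not part of the package:

    import requests

    # Sketch of the filter-and-retest passes described above, assuming the
    # harvested proxies were exported as plain "ip:port" lines.
    TEST_URL = "http://www.example.com"

    def alive(proxy, max_seconds):
        proxies = {"http": "http://" + proxy, "https": "http://" + proxy}
        try:
            requests.get(TEST_URL, proxies=proxies, timeout=max_seconds)
            return True
        except requests.RequestException:
            return False  # dead, blocked, or slower than the limit

    with open("harvested_proxies.txt") as f:
        candidates = [line.strip() for line in f if line.strip()]

    # 10s, then 5s, then repeated 2.5s passes, mirroring the slider values.
    for limit in (10.0, 5.0, 2.5, 2.5, 2.5):
        candidates = [p for p in candidates if alive(p, limit)]

    with open("clean_proxies.txt", "w") as f:
        f.write("\n".join(candidates))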
4. Your 1st ScrapeBox Project
4.1 Creating Project Directory
The first and crucial step to maintaining good organization is to create a project for your link building campaign. Ideally we will create a folder called "YourProjectName" inside your Scrapebox folder, under "Projects". Inside this folder we will create the following directories:
"YourProjectName" Folder
|_ "Posted" Folder
|_ "Failed" Folder
|_ "Captcha" Folder
|_ "Found" Folder
|_ "Errors" Folder
|_ "NotFound" Folder
|_ "Poster Files" Folder
4.2 Create Websites File
We need to create a file containing the list of websites/pages you will link to in each comment. Inside
your "Poster Files" folder create a new text document and name it "Websites.txt". Inside this new text
document you will place on each individual line the URL to the website or webpage you would like
Scrapebox to use inside the "Website" field on each blog.
I use a tweaked websites file that contains anchor texts as well, so the Names file is basically useless for me.
http://scrapebrokers.com/ {this page|this site|scrapebrokers.com|www.scrapebrokers.com|
scrapebrokers.com|www.scrapebrokers.com|click here|this website|scrapebrokers|this
company|this page|this guys|website|here|download|company|scrapebox autoapprove
trackback list|do follow|faster poster|scrapebox high pr auto approve|scrapebox auto approve|
proxy sources for scrapebox|scrapebox proxy sources|scrape urls for particular domain|
scrapebox proxy source|find proxies for scrapebox|proxies for scrapebox|proxy lists|scrapebox
aa|autoapprove faster poster download}
http://scrapebrokers.com/four-blackhat-methods-to-build-backlinks {this page|this site|
scrapebrokers.com|www.scrapebrokers.com|scrapebrokers.com|www.scrapebrokers.com|click
here|this website|scrapebrokers|this company|this page|this guys|website|here|download|
company|trusted source|source|this source|link pyramid|edu backlinks|backlink building
services|.edu link building|buy edu links|backlink building service|backlink service|seo link
building|backlink building seo services|buy backlinks|backlink building services|smarter
submit|backlink building|backlink service|seo link building}
As you can see, I mix generic keywords like "this page" and "this website" with gold anchor texts in order to avoid being penalized for over-optimization and to keep my backlinking profile natural. More details on this in the chapter on how to stay safe from Google updates.
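To make the mechanics concrete, here is a small Python sketch of how the {a|b|c} spintax above resolves: each time a comment is posted, one variant is picked at random as the anchor text (the sample line is shortened for readability):

    import random
    import re

    # Each {option|option|...} group is replaced by one randomly chosen
    # option, which is how a posted comment picks its anchor text.
    def spin(text):
        return re.sub(r"\{([^{}]*)\}",
                      lambda m: random.choice(m.group(1).split("|")),
                      text)

    line = "http://scrapebrokers.com/ {this page|this site|click here|scrapebox aa}"
    url, _, anchors = line.partition(" ")
    print(url, "->", spin(anchors))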
4.3 Create Comments File
The next file you will need to successfully use Scrapebox is a list of comments to be used while posting to your new list. This is obviously a necessary and very important file.
Scrapebox itself does not have a feature to generate comments. Luckily, with your purchase you receive unique, ready-made comments for over 20 niches that you can simply plug into ScrapeBox and start posting. Just copy the "Comments.txt" file related to your niche from your product download into your "Poster Files" folder.
If you can't find comments for your niche there, use the Ultimate Comment Scrapper software included in this package to scrape yourself a great list of niche comments. Using unique comments reduces your footprint and makes it harder for other people to find and harvest all of your backlinks. You may create your own list of comments to be used while posting to blogs, or you can find free (or paid) comment lists on the Internet.
Searching Google for "Scrapebox Comment Lists" will bring back some helpful resources. I advise you to use a comments file that is appropriate for your niche. If you do not have one, use the Ultimate Comment Scrapper included in this package to generate one.
4.4 Start Posting
In order to get the most backlinks using an auto approve list and Scrapebox, you need to follow some easy but strict steps. I will teach you the exact method I'm using right now to get an over 85% posting rate with our Scrapebox auto approve lists. So before jumping to the conclusion that the list is poor, make sure you are following this method. It has been tested and improved by me for over a year now, since I do daily link blasts and package auto approve lists.
The Comment Poster section on the bottom right of the application is where the magic happens. In this section I will explain the best methods of posting with this package while maintaining organization within your project.
In the bottom right 'Comment Poster' section you will see 5 input fields: Names, Emails, Websites, Comments and Blog List. You will need to manually load each file into the correct field. Click the "Open" button next to each field and load the corresponding file from your "Poster Files" folder. After completing this, you should have your Names, Emails, Websites and Comments files loaded.
Now we need to load our "Blog List" for commenting. You may load one of the Auto Approve lists supplied in this package, or, if you went a step further and split the Auto Approve lists, you can use one of those files.
After you have loaded all five files Scrapebox requires for posting, make sure the "Fast Poster" option is selected in the top left corner of the "Comment Poster" section, then click "Start Posting".
Depending on the size of your blog list, your Internet connection speed, your computer configuration and your router settings, this can take anywhere from 3 hours to 24+ hours. If you are experiencing long posting times, I suggest reading Appendix 1 – Tweaking the Operating System, mentioned in the "Configuring your Computer and Network" chapter.
After your posting session ends, it's a good idea to export your session for further analysis and to rerun failed posts. Click "Export" in the bottom window and you will see a few options. What we want to do is export our 'Posted', 'Failed' and 'Captcha' entries. Start with "Export 'Posted' entries" and save the file into your created "Posted" folder. It's a good idea to use a date naming convention, e.g. Posted-MO-DAY-YEAR.txt. Repeat the same steps for the Failed and Captcha entries, saving the output into their respective folders.
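The date-stamped filing can be scripted as well; this sketch assumes ScrapeBox exported posted.txt, failed.txt and captcha.txt into the project root, which are placeholder names:

    import datetime
    import shutil

    # Files freshly exported session lists into the project folders using
    # the Posted-MO-DAY-YEAR.txt convention.
    stamp = datetime.date.today().strftime("%m-%d-%Y")

    for kind in ("Posted", "Failed", "Captcha"):
        shutil.move(kind.lower() + ".txt", f"{kind}/{kind}-{stamp}.txt")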
Use the free ScrapeBox Link Checker addon, which can be downloaded from the official website. Personally I find it faster than the Link Checker included in ScrapeBox. First load the listname_posted.txt entries and check for found links; export them as listname_posted_found.txt. Then load the listname_failed.txt entries: export the found ones as listname_failed_found.txt and the not-found ones as listname_failed_notfound.txt.
Load all the listname_failed_notfound.txt files and post to these lists again. You will normally find that 25% or more of the entries are successful on a second try. Don't forget to shuffle this second list as well! A third run will result in even more successes, but the count is usually too low to make the time and effort worthwhile.
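Preparing that second run can also be scripted; this sketch gathers every *_failed_notfound.txt export, de-duplicates the URLs and shuffles them before you load the result back into ScrapeBox (the folder and file pattern are illustrative):

    import glob
    import random

    # Collect the URLs from every not-found export for the second run.
    urls = set()
    for path in glob.glob("NotFound/*_failed_notfound.txt"):
        with open(path) as f:
            urls.update(line.strip() for line in f if line.strip())

    urls = list(urls)
    random.shuffle(urls)  # avoid hitting one domain in rapid succession

    with open("second_run.txt", "w") as f:
        f.write("\n".join(urls))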
Don't trust the "failed" results in the bottom right column of ScrapeBox. ScrapeBox reports fails for 302 (redirect) responses on posting as well as for 500 (server error) responses. There also seem to be new plugins on blogs lately that report all posts as 404 (not found) errors even though the post was successful. I find that in most cases the link really is posted. You can verify this for yourself by exporting the "failed entries" in the comment poster after fast posting, loading this failed list back into ScrapeBox, and checking links.
As a rule of thumb, I would suggest waiting 24 hours before reposting to failed entries. The main cause of failed entries is the blog/website host not responding to Scrapebox in time.
4.5 Posting Trackbacks
Posting trackbacks is a great way to get your link on a website without the conventional blog comment.
A trackback tells the blog/website owner that you are linking to their site and that they should reciprocate the link.
Using trackbacks is as simple as following the steps above, except that you only need the Websites, Comments and Blog List files to begin posting. Once you have these three files loaded, go ahead and click Start Posting. Trackbacks can take a bit longer to post than normal blog comments but work in much the same way. Afterwards, you can export your successes (and failures, for later use) and run the results through the link checker.
4.6 Link Checking
After you have finished posting to each list provided in this package you can check to see your
links. This step is entirely optional and is very resource intensive on your network. You can
expect to wait 24-36 hours to check 250,000 URLs.
Another good way to see your progress is to use MajesticSEO.com and enter your domain information.
It will display backlinks found pointing to your site. They will start showing up usually 24-48 hours
after completing your link building campaign.
Using the link checker is a very simple process. Inside the 'Comment Poster' section, load your Websites and Blog List files into their respective fields, then click the "Check Links" option. A new window will open with a list of URLs from your blog file. It's as simple as clicking "Start" and waiting. After the program has finished, you may click "Start" again to rerun any failed entries. Do not worry: it will keep your previously checked links and only retry URLs that returned errors. Once this is completed, make sure to use the "Export Links" option and save your Found, Not Found and Error files into their respective folders.
Personally I try to avoid using the built-in ScrapeBox link checker, for two reasons: it keeps an instance of Scrapebox busy, and I find it works considerably slower than the standalone link checker you will find in the SEO Tools folder of our package.
5. Powerful Backlinking Strategy Using ScrapeBox Auto Approve Lists
It's highly recommended to create a plan that spreads your link building campaign over several weeks while maintaining consistency.
Let's say for example we decide to use this new package to build links on a new domain. We
create a short plan that will outline our goals and timeline.
An example link building campaign:
Tier 1 – backlinks to your money/main site; you can use:
• EDU Auto Approve List and EDU Auto Approve Trackbacks (remove duplicate domains)
• High PR DoFollow AA List and AA Trackbacks (remove duplicate domains)
• Manually High PR DoFollow
• Slow Poster High PR (remove duplicate domains)
• SEO Lists (Wiki, Articles, Pligg)
Tier 2 – backlinks to your Tier 1 backlinks; you may use:
• EDU Auto Approve List and EDU Auto Approve Trackbacks (no need to remove duplicate domains)
• High PR DoFollow AA List and AA Trackbacks (no need to remove duplicate domains)
• Manually High PR DoFollow
• Slow Poster High PR (no need to remove duplicate domains)
• SEO Lists (Wiki, Articles, Pligg)
• Full Auto Approve DoFollow List
Tier 3 – backlinks to your Tier 2 backlinks; you may use:
• EDU Auto Approve List and EDU AA Trackbacks (no need to remove duplicate domains)
• High PR DoFollow AA List and AA Trackbacks (no need to remove duplicate domains)
• Manually High PR DoFollow
• Slow Poster High PR (no need to remove duplicate domains)
• SEO Lists (Wiki, Articles, Pligg)
• Full Auto Approve DoFollow List
• Full Auto Approve Main List
• Full Auto Approve Trackbacks List
Another important aspect of link building is backlink velocity. In short, the number of backlinks you build needs to stay constant or increase over time. If you create 50,000 backlinks one day and then only 100 or so the next month, Google will not look upon it kindly. By now I'm sure you have caught on to this example of a safe, secure and reliable link building campaign: each week we progressively increase the number of links we build throughout the campaign.
Week 1
We start off with a small burst of incoming links from various blogs. Run a nice mix of Auto Approve and some well-placed contextual comments on moderated blogs. Keep this number under a few hundred. As the week progresses we can check MajesticSEO or Yahoo! Site Explorer to see if our links are being picked up.
Week 2
We continue our link building campaign with another blast of links. Once again, let's keep it in line with the number of links we built in our first week.
Week 3
By week 3 we can begin to progressively increase the number of links we are building. Perhaps even sprinkle in a few .EDU links (these can be very powerful). Continue to check your progress using MajesticSEO, Yahoo! or Google Webmaster Tools.
Week 4 and so on...
It's a good idea to eventually set a limit on the number of links you build each week, but do keep in mind that consistency is key! The same goes for diversifying your backlink sources; don't just stop with this package. Having incoming links from all corners of the web is a great way to secure top spots in the SERPs.
To assist with this type of link building campaign, we are going to need to split our lists into smaller, more manageable sizes to be used week to week. The next section explains just how simple that is to do.
6. Tips for Working with Lists
Splitting your lists into easier-to-use sizes is a simple process done right inside the Scrapebox application. It also helps you keep a well-maintained and organized setup for your link building campaign.
Simply import your URL list using the "Import URL List" drop-down box under "Manage Lists". You can use either "Import and replace current list" or, if you would like to append the file to your currently loaded URLs, "Import and add to current list".
Once your list has loaded into the application, use the "Export URL List" drop-down box and select the "Export as Text (.txt) and split list" option. You will see a small popup box allowing you to designate the number of entries per file. As a forewarning: if you select a low number on a large list, the application will export a large number of files into your directory, so try to keep the ratio at a manageable size.
When splitting your list for link building campaigns, it is a good idea to randomize the list first so you will receive links from different domains instead of all the pages on a single domain. This is as easy as following the steps above, but instead of using export and split files you select "Export as Text (.txt) and randomize list". Then repeat the steps above to export into split files.
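The same randomize-then-split workflow can be reproduced outside of ScrapeBox with a few lines of Python; the input file name and chunk size below are only examples:

    import random

    CHUNK = 5000  # entries per output file; pick what suits your weekly plan

    with open("AutoApprove_MainList.txt") as f:
        urls = [line.strip() for line in f if line.strip()]

    random.shuffle(urls)  # mix domains so each chunk spans many sites

    for i in range(0, len(urls), CHUNK):
        with open(f"MainList_part{i // CHUNK + 1}.txt", "w") as out:
            out.write("\n".join(urls[i:i + CHUNK]))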
7. Tips for Choosing Anchor Texts
Anchor text is the bread and butter of your link building campaign. When Google or other search engines crawl webpages looking for external links, they take note of the text that links to your webpage or blog. Search engines use this text to score your particular page or site for that specific keyword in the search results. You have a few options for setting what anchor text will be used while commenting.
One method is to append an HTML link at the end of each comment in your "Comments.txt" file. You would put at the end of each comment an HTML link pointing to your site, such as: <a href="http://www.YourSite.com/YourPage">AnchorTextHere</a>
While this is effective, you may find that some blogs/websites disable HTML code inside their comments, meaning your backlink will not be created successfully!
Another, recommended, method is using the Name/Website fields that nearly all blog comment sections contain. When you fill in this information and the link goes live, it will use your name as the anchor text for the website you specified. Using Scrapebox we can easily set your anchor texts by placing them inside the "Names.txt" file you load for use with commenting. Scrapebox will rotate the anchor texts in this file, and when the comment appears on the site it will show as a link back to your site.
The above method is a good way to link back to your site without having URL links inside the
comment text.
8. Avoid Being Flagged by Google Panda, Penguin or the Sandbox
The Google "Sandbox" is a term used loosely for when a site or blog has been penalized by Google's search engine. It can mean either a partial penalty or a complete deindexing of content from the search results. Many factors play into Google's decision to put your website or blog into the "Sandbox". One notable effect many SEO professionals have found is that inconsistent and aggressive link building can actually harm your main goal of moving higher in the SERPs.
It is wise to follow some basic steps when using a link building package like this one to reduce the risk of penalties. A good plan is to treat this package as a continual link building campaign and not a single blast-and-go. This holds especially true for newer sites with a low domain age.
Diversify your anchor texts – If you check most websites that gain natural backlinks, you will see they have varied anchors. That's normal, because when someone links to your page they do not do it with your keywords in mind but with text that is part of their article, your website name, etc. Common anchors you will find are: www.Websitename.com, website.com, click here, read this, on this website, etc. So update your anchor text file to make it look natural. Example for this URL:
http://www.scrapebrokers.com/build-great-backlinks-in-2013-using-scrapebox-lists
{http://www.scrapebrokers.com|http://www.scrapebrokers.com|
http://scrapebrokers.com| scrapebrokers.com|scrapebrokers.com| ScrapeBrokers|
Scrapebrokers| this company| this website| this blog post|here| there| check here|
scrapebox list| scrapebox autoapprove list| scrapebox auto approve list|scrapebox
backlinks| backlinks guide}
A good mix is to have around 30% general URLs of your site with variations, 30% generic keywords, and 40% related anchor keywords you want to rank for.
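Here is a quick Python sketch of that 30/30/40 mix, building a Names.txt-style file that ScrapeBox can rotate through; the anchors themselves are illustrative, not pulled from the package:

    import random

    # The three pools and their 30/30/40 weights; anchors are examples only.
    url_variants = ["http://www.yoursite.com", "www.yoursite.com", "yoursite.com"]
    generic = ["click here", "this website", "read this", "here"]
    keywords = ["your main keyword", "your secondary keyword"]

    def pick_anchor():
        pool = random.choices([url_variants, generic, keywords],
                              weights=[30, 30, 40])[0]
        return random.choice(pool)

    # Build a Names.txt-style file that ScrapeBox can rotate through.
    with open("Names.txt", "w") as f:
        f.write("\n".join(pick_anchor() for _ in range(1000)))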
Remove duplicate domains when linking to your money site – Google does not really like sitewide or blogroll backlinks. Nobody knows exactly why, but you had better remove duplicate domains if you post auto approve lists directly to your money/main site.
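ScrapeBox can do this with its own remove-duplicate-domains option; for reference, the same job looks like this in Python (file names are examples):

    from urllib.parse import urlparse

    # Keeps only the first URL seen for each domain.
    seen, unique = set(), []

    with open("AutoApprove_DoFollow_List.txt") as f:
        for line in f:
            url = line.strip()
            if url:
                domain = urlparse(url).netloc.lower()
                if domain not in seen:
                    seen.add(domain)
                    unique.append(url)

    with open("DoFollow_OnePerDomain.txt", "w") as f:
        f.write("\n".join(unique))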
Diversify your backlink types – If you focus only on blog backlinks, or only on article submission, or only on bookmarking, it will not look good (read: unnatural) to Google, so most of the backlinks will not be counted, and if you build thousands you may even receive an unnatural-linking notice and a penalty. So mix your backlinking strategies, adding wikis, web 2.0 sites, bookmarking sites and video. Very important: you should add some YouTube videos as well; Google trusts its own child, doesn't it? We just released an SEO Lists Package with over 3,000 sites: article directories, bookmarking, videos, forums, and web 2.0 sites.
Do not forget social media – It has been demonstrated that Google uses social media data from Twitter, Facebook and G+ to rank websites. So if you do not have these accounts, you should create them ASAP.
Quality over quantity rule for money site backlinks – You need to take care with what kind of backlinks you are building to the money/main site. Too many low quality links hosted on the same IP, etc., can do more harm than good. That's why you have to build your best backlinks on Tier 1, the backlinks to the money site. Use high PR, DoFollow and .edu sites, plus web 2.0, guest posting and social media.
9. Troubleshooting and Help
Besides this guide, I invite you to check our blog at http://www.scrapebrokers.com/blog and our forum at http://forum.scrapebrokers.com. You will find many more tips related to Scrapebox and SEO tools that you can benefit from.
If you have any questions or you simply need something explained more clearly don't hesitate to
contact us. Me, Dennis or anyone else from the friendly staff will get back to you ASAP.
http://www.scrapebrokers.com/support
10. Final Words
This guide was written with the sole purpose of offering quality information to all our customers, so you can get the most out of your software and our products. I hope you have found the information here helpful, and I wish you great success in your link building campaigns.
In addition, you will always receive any and all future updates to this guide. Notification will be
sent to your email address.
Before closing this document and packing it into a PDF, I'd like to add several tips that crossed my mind; think about them in your SEO campaigns:
1 – Focus on authority as well. Think of websites that are authority sites in their niches and try to get backlinks from them. Backlinks from sites like YouTube, Wikipedia, the Wall Street Journal, Foreign Policy and other top sites will give you a lot of authority and boost your rankings. So if you mix auto approve links with authority links, your rankings will climb.
2 – Try to follow a natural link building profile. Besides using generic anchor texts along with your keywords, you should also try to keep a steady link velocity and be consistent in your SEO efforts.
That's all, folks!
Now go get your website that #1 spot on Google! :)
Appendix 1 - Tweaking the Operating System
Warning! This appendix is written for Windows 7 / Server 2008 only; many of the changes described here apply only to those environments.
USE THIS GUIDE AT YOUR OWN RISK. I will not be held responsible for any problems you cause by following this guide; you proceed at your own risk. Make sure to back up when I tell you to, in case you need to restore the original settings.
To start you will need to download the latest version of SG TCP Optimizer available at
http://www.speedguide.net/downloads.php
After you have downloaded SG TCP Optimizer, go ahead and launch the application. There is no installation necessary; it loads immediately.
[Screenshot: the SG TCP Optimizer main window]
By default, you will see a lot of options set as default or blank. We will now make the changes needed
to optimize your system for heavy Scrapebox usage.
Your first step is creating a BACKUP of your current settings. This can be done by going to
FILE > BACKUP CURRENT SETTINGS. Do this now before you make any changes!
After you have backed up your settings, move on to selecting the proper network adapter from the drop-down box. Make sure to select the connection you use when Scrapebox is running.
Once you have done that, go ahead and select the CUSTOM option in the lower right hand corner of the application. You will now need to read through the following options and set them accordingly. I have labeled my RECOMMENDED setting under each option. Some options may require router, network adapter and/or CPU support! It is your responsibility to read the information before changing any options.
General Settings
TCP Window Auto-Tuning
TCP Auto-Tuning enables TCP window scaling by default and automatically tunes the TCP receive window size for each individual connection, based on the bandwidth delay product (BDP) and the rate at which the application reads data from the connection, so you no longer need to manually change the TcpWindowSize registry key value, which applies to all connections.
Setting: Normal
Congestion Control Provider
By default, Vista and Windows 7 have CTCP turned off; it is only on by default under Server 2008. Turning this option on can significantly improve throughput and packet loss recovery.
Setting: CTCP
TCP Chimney Offload
TCP chimney offload enables Windows to offload all TCP processing for a connection to a network
adapter. Offloads are initiated on a per-connection basis. Compared to task offload, TCP chimney
offload further reduces networking-related CPU overhead, enabling better overall system performance
by freeing up CPU time for other tasks.
Setting: Enabled
Receive-Side Scaling State
The receive-side scaling setting enables parallel processing of received packets on multiple processors while avoiding packet reordering. It does this by separating packets into "flows" and using a single processor to process all the packets of a given flow. Packets are separated into flows by computing a hash value based on specific fields in each packet, and the resulting hash values are used to select a processor to handle the flow. This approach ensures that all packets belonging to a given TCP connection are queued to the same processor, in the same order in which they were received by the network adapter.
Setting: Enable (if you have 2 or more processor cores; otherwise leave as default)
Note: For this setting to work, your network adapter needs to support RSS. This can be enabled in your network adapter's driver configuration: under the Advanced tab, if there is an option for Receive Side Scaling, set it to Enabled.
[Screenshot: Receive Side Scaling enabled under the network adapter's Advanced driver settings]
Direct Cache Access (DCA)
Windows 7 and Server 2008 (but not Vista) add NETDMA 2.0 Direct Cache Access support. Direct Cache Access (DCA) allows a capable I/O device, such as a network controller, to deliver data directly into a CPU cache. The objective of DCA is to reduce memory latency and the memory bandwidth requirement in high bandwidth (Gigabit) environments. DCA requires support from the I/O device, system chipset, and CPUs.
Setting: Enabled
Note: You will likely only see a performance increase if using a Gigabit router with Cat-6 cables and a Gigabit-capable network card. This setting requires support from your network card, system chipset and CPUs.
NetDMA (TCPA)
NetDMA enables support for advanced direct memory access. In essence, it provides the ability to
more efficiently move network data by minimizing CPU usage. NetDMA frees the CPU from handling
memory data transfers between network card data buffers and application buffers by using a DMA
engine.
Setting: Enabled
Note: Not available under Windows Vista
Time-to-Live (TTL)
Time-to-live (TTL) is a value in an Internet Protocol (IP) packet that tells a network router whether or
not the packet has been in the network too long and should be discarded. For a number of reasons,
packets may not get delivered to their destination in a reasonable length of time.
This setting needs to match the timeout setting used in Scrapebox; we set the Fast Poster timeout to 90 seconds in the ScrapeBox Settings chapter.
Setting: 90
ECN Capability
ECN is only effective in combination with an AQM (Active Queue Management) router policy. It has a more noticeable effect on performance with interactive connections and HTTP requests, in the presence of router congestion/packet loss. Its effect on bulk throughput with large TCP windows is less clear.
Setting: Enabled
Note: For this setting to work, your router must support ECN. To check whether your router supports ECN, you can run a test using Internet Explorer at:
http://www.microsoft.com/windows/usi…d/default.mspx
After the test completes, the results will contain information about your internet connection and whether ECN is supported by your router.
Windows Scaling Heuristics
Windows Vista/7 has the ability to automatically change its own TCP window auto-tuning behavior to a more conservative state regardless of any user settings. It is possible for Windows to override the auto-tuning level even after a user sets a custom TCP auto-tuning level.
Setting: Disabled
Checksum Offloading
This setting allows for reducing CPU load by offloading some tasks required to maintain the TCP/IP
stack to the network card. Theoretically, Windows should automatically detect capable network
hardware.
Setting: Enabled
Window Scaling
The TCP window scale option is needed for efficient transfer of data when the bandwidth-delay
product is greater than 64K. For instance, if a T1 transmission line of 1.5Mbits/second was used over
a satellite link with a 513 millisecond Round Trip Time (RTT), the bandwidth-delay product is
(1500000*.513) = 769,500 bits or 96,188 bytes. Using a maximum buffer size of 64K only allows the
buffer to be filled to 68% of the theoretical maximum speed of 1.5Mbits/second, or 1.02 Mbit/s.
By using the window scale option, files can be transferred at nearly 1.5Mbit/second utilizing nearly all
of the available bandwidth.
Setting: Enabled
Timestamps
TCP is a symmetric protocol, allowing data to be sent at any time in either direction, and therefore
time stamp echoing may occur in either direction. For simplicity and symmetry, we specify that
timestamps always be sent and echoed in both directions. For efficiency, we combine the time stamp
and time stamp reply fields into a single TCP Timestamps Option.
In the case of Scrapebox usage this setting should be disabled. It creates an additional 12-byte overhead for each packet being sent and further congests the network with unnecessary information.
Setting: Disabled
Advanced Settings Tab
Internet Explorer Optimization
This setting has no effect on Scrapebox performance and can be left at default. You can set each field accordingly to improve IE performance.
MaxConnectionsPerServer Setting: 8
MaxConnectionsPer1_0Server Setting: 8
Host Resolution Priority
This section improves DNS and host name resolution in general. It helps web pages load faster, and
has negligible effect on downloads. This will help increase the speed at which Scrapebox can load a
website for commenting.
Local Priority Setting: 4
Host Priority Setting: 5
DNS Priority Setting: 6
Netbt Priority Setting: 7
SynAttackProtect
This is a fairly undocumented feature included in recent versions of Windows. It should be disabled to avoid unnecessary overhead while running Scrapebox.
Setting: Disabled
Network Memory Allocation
LargeSystemCache Setting: Enabled
Size Setting: Optimized
Quality of Service
QoS: NonBestEffortLimit Setting: 0
DNS Error Caching
Windows has built-in DNS (Domain Name System) caching, which caches resolved host names for faster access and fewer DNS lookups. This is generally a great feature, with the one downside that failed DNS lookups get cached by default as well. When a DNS lookup fails (due to temporary DNS problems), Windows still caches the unsuccessful DNS query and in turn fails to connect to the host, regardless of the fact that the DNS server might be able to handle your lookup seconds later. You can manually flush failed DNS lookups by typing ipconfig /flushdns at a command prompt, or you can simply set the 3 values in the Optimizer to "0" and it will set the relevant registry settings.
This is a very important change for Scrapebox. When you are posting comments, Windows by default keeps a record of all domains accessed through your network card; failed connections that would succeed during your next run might otherwise be passed over as failed again, even though the host was alive.
NegativeCacheTime Setting: 0
NetFailureCacheTime Setting: 0
NegativeSOACacheTime Setting: 0
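These are plain registry DWORDs, so the change can also be scripted. Below is a small Python sketch using the standard winreg module; the key path is the Windows 7-era DNS cache location these values are understood to live under. Run it as Administrator and back up your registry first:

    import winreg

    # Sets the three DNS negative-cache values to 0 directly in the
    # registry -- the same values TCP Optimizer writes.
    PATH = r"SYSTEM\CurrentControlSet\Services\Dnscache\Parameters"

    with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, PATH, 0,
                            winreg.KEY_SET_VALUE) as key:
        for name in ("NegativeCacheTime",
                     "NetFailureCacheTime",
                     "NegativeSOACacheTime"):
            winreg.SetValueEx(key, name, 0, winreg.REG_DWORD, 0)

    print("Done - flush the cache now with: ipconfig /flushdns")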
Gaming Tweak – Disable Nagle’s algorithm
These settings relate to Nagle's algorithm. It is best to leave them at their defaults, as changing them will not positively affect Scrapebox performance.
Setting: Default
Dynamic Port Allocation
MaxUserPort Setting: 65535
TCP TimedWaitDelay Setting: 30
Finished! Go ahead and apply your settings and restart your machine when prompted.