
Sources:
http://bestblackhatforum.com/Thread-INFO-SENuke-Private-or-Public-Proxies-Must-Read-To-Learn
http://www.warriorforum.com/adsense-ppc-seo-discussion-forum/394138-scrapebox-question-about-free-paid-proxies.html

This post is aimed at the newbies around here. It isn't rocket science, and most of us probably know everything I am going to say. However, it may shave off some time for newbies, because spending hours to find one answer sucks. Hopefully this will spare some people the pain of paying for private proxies they don't need, and clarify where and when private proxies ARE needed. It will also teach you how to change your IP address. It covers whether or not you need private proxies for SENuke, xRumer, Scrapebox, BookmarkingDemon, Article Marketing Robot, TrafficTravis, WikiBomber, or any other software. If your software isn't listed, look at what I did list and use it as a reference. For example, I didn't list GSER, but it is a program just like Scrapebox, so the same concepts apply...

What are proxies?
Proxies are servers/computers with an open port ready to receive a connection. Just as a website listens for connections on port 80, a proxy waits for someone to connect on whatever port it is set up to use. There are some standard ports, but you can use whatever number you want (if you're the one creating it, not just a user!). What proxies let you do is mask your IP address. You connect to the proxy, and the proxy connects to your destination. Your destination sees and logs the proxy's IP and sends the reply back to the proxy, which passes the data back to you. It's a middle man. When you want to avoid detection, hide your identity, bypass filters, or bypass restrictions... you use a proxy. However, there are many kinds of proxies! Continue on...
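
If you want to see that middle-man idea in code, here is a minimal sketch in Python using the requests library and the ipify IP-echo service. The proxy address is a placeholder, not a working proxy; swap in one of your own:

    import requests

    proxy = "203.0.113.10:8080"  # placeholder public proxy (IP:port) -- use your own
    proxies = {"http": f"http://{proxy}", "https": f"http://{proxy}"}

    # The destination only ever sees the proxy's IP, not yours.
    resp = requests.get("https://api.ipify.org", proxies=proxies, timeout=10)
    print("The destination saw this IP:", resp.text)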

What are the different kinds of proxies?
You have public proxies and private proxies. For the scope of this guide, we won't dive any deeper into the sub-types, like SOCKS or gateways... you're just interested in the "normal" proxies. So: public and private. Public proxies are open to the public and don't have a password... hence why they are called public lol. There are a few port numbers you'll see frequently, but you don't need to worry about them; they don't really matter for your purposes. After looking at a few thousand proxies, you will know which ports are common and which are uncommon. If you really need to know, look it up on Google. This is confusing enough for beginners already.

Private proxies usually require a password, which means you don't have a ton of people on them. Usually one is only for you, and locked to your IP address (that's how mine were). When I changed my IP, I had to update it in the provider's back office so my proxies would work with my new IP. Private proxies are generally much faster than public proxies, last a lot longer, and are more reliable. Public proxies die often, and it's easy to lose half your list in 12 hours!
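
With a password-protected private proxy, the only difference is the credentials in the proxy URL. Another minimal sketch; the user, password, and address below are all placeholders your provider would replace:

    import requests

    # Placeholder credentials -- your provider gives you the real ones.
    user, password = "myuser", "mypass"
    proxy = f"http://{user}:{password}@203.0.113.20:3128"

    resp = requests.get("https://api.ipify.org",
                        proxies={"http": proxy, "https": proxy},
                        timeout=10)
    print("Posting from:", resp.text)  # should print the proxy's IP, not yours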

When to use public proxies?
When posting to masses of sites with Scrapebox and xRumer. At least for your big blasts. For example, I will load up a 100k list and run it with around 1,500 proxies. Once it's done, I load up my failed URLs and run them again. Usually the 2nd run is the next day, and I re-harvest and check my proxies so I have a fresh list again! You will still have a large number of failed URLs, so don't waste private proxies on them. If you really want to get what is left, then PR check the failed list and pick a cutoff, say PR3 and higher. Then check outbound links.

Find a reasonable OBL number and keep those URLs. Then run them through the SB analyzer. You shouldn't need to run them through the alive checker, because you should never post to a list in SB that hasn't been alive checked! EVER! That is just dumb. Alive checking can cut your lists in half sometimes and saves bandwidth, and when using public proxies you don't want to waste a good proxy on a dead site, because then the next live site gets stuck with a bad proxy and fails. Work smarter, not harder! It makes no sense to attempt to post to dead sites, so let's get them the hell out of our lists first!
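
If you export the checked list with its PR and OBL values, a few lines of Python can make the cut for you. This is only a sketch: the file name and the url,pr,obl column order are assumptions, so adjust them to match whatever your export actually looks like.

    import csv

    keepers = []
    with open("failed_urls_checked.csv", newline="") as f:
        for row in csv.reader(f):
            url, pr, obl = row[0], row[1], row[2]
            try:
                # Keep PR3+ pages with a reasonable outbound-link count.
                if int(pr) >= 3 and int(obl) <= 50:
                    keepers.append(url)
            except ValueError:
                continue  # skips a header row or junk lines

    with open("retry_list.txt", "w") as f:
        f.write("\n".join(keepers))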

When to use Private Proxies?
Private proxies can be used in every situation! But burn them on everything and they get blacklisted, and your success rates drop hard. Save them for the right lists. When you have a list you have run through, cleaned, scrubbed, and filtered, and you have one badass list of, say, 5k PageRank 5 and higher sites (that Scrapebox can run) with low outbound links... a cream-of-the-crop list (#3, behind captcha and gov/edu links). This is a high quality list: small, targeted, and to the point. I would first run it using public proxies, then run the failed list with 20-30 private proxies. These high quality URLs are worth increasing the risk to your private proxies, but you still want to be smart about it. Wordpress's Akismet is like freaking ADT for blogs! Not everyone runs Akismet, but it comes stock with Wordpress, so quite a few sites do. Honestly, this is why a lot of people get shitty success rates at Wordpress.

It's because their IPs are on the Akismet block list. You don't want your private IPs on that list. So this is why I normally make my lists small and super-scrub them. A good chunk of high-pagerank backlinks can make a huge difference for your website, even be the factor that gets you to the 1st page. It could be make or break. So if I can snag one or two thousand PR3-and-higher backlinks, it's worth it. However, be super careful. Getting top rankings may be worth buying a new set of private IPs :) but we would rather live on for another site if we can. I for SURE use private proxies on a high-pagerank captcha blog list, but I scrub the list first: all alive, PR checked (only keeping, say, PR3 or PR4 and higher), low OBL, and platforms Scrapebox can run. Your really high-PR URLs, do by hand using the manual poster. You can do 300 or so URLs that are, say, PR6+. If you write a good post, you will get a high success rate.

Some things are best done by hand (like really high PR captcha blogs). However, we use the manual poster to assist in the task :) These backlinks are just too good to risk on automatic posting with our half-assed spun comments, names, etc. I do recommend mixing up your anchor text though: sometimes use your name, sometimes your keyword(s), sometimes fake names that look real. When using keywords, you get a lower success rate... but it is common for legitimate human visitors to use their anchor text when posting good comments. They are making a good post, just being SEO-aware. Some admins don't care; others will reject all keyword names, especially if your keyword is the done-a-million-times "make money online" or "lose weight" crap. So if you're in small niches that haven't been spammed hardcore, you can get more keyword anchor text comments to go through.

Why are captcha blog URLs so valued?

Scrapebox can EAT captchas, but we generally don't have our captcha solver on when doing 100k or even 30k mass blasts. Usually, after running, we save off our failed URLs, success URLs, and captcha URLs, building a large list of nothing but captcha URLs to clean and run later ;) There are tons of PR0 and other low-level sites with captchas on, and if we are paying for captcha solving, we are going to spend it on something worthwhile! I don't mind burning through $10 worth of captchas in a few days... but I am not going to do that on shitty low-PR sites! Captcha blogs usually have a lower OBL number (fewer comments) and are generally less spammed. There are some auto-approve captcha blogs out there too :) Those are nice. I have a list of 2k auto-approve captcha blogs right now. They don't last forever though; once they get found, the owner quickly switches to moderated posts. Generally, for blogs to get a high pagerank, they need good comment spam control. So most PR5 blogs you see have captchas on the comments, and comments must be reviewed before being posted. So you do those by hand, to increase the odds of getting approved.

Find a balance between what you do by hand and what you let the software do for you. We are lazy, and nobody wants to sit for 4 hours writing comments! Even with Scrapebox conveniently loading the next post for us! So you cut off the top portion of your list to do by hand. If you don't have anything higher than maybe PR3, then don't do any by hand; there are plenty more out there! Set a bar. For me, each list is different. If I start with, say, PR5 and higher and still have too many URLs to do by hand, I raise it to PR6 and higher... if I still have too many, PR7 and higher... until I get a reasonable number of good sites to do by hand. Maybe 100-300. Develop your own standards, patterns, routines. Some lists won't have anything higher than PR6... and if you have too many PR6s and don't care about the project too much, just run it automatically. I do recommend having at least 1,000 of your backlinks done by hand! That doesn't mean all in 1 day... but over a month, across a handful of runs, build up a total of around 1k manually posted comments. If you stick to quality, badass, strong sites for your "manual" submissions, you will have a nice core of good backlinks.
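
That raise-the-bar routine is easy to script. A sketch, assuming you already have (url, pr) pairs from a PR check export:

    # Bump the PR floor until the manual-posting list is a size you can
    # realistically handle by hand (100-300 URLs).
    def manual_list(urls_with_pr, start_pr=5, target=300):
        floor = start_pr
        picked = [u for u, pr in urls_with_pr if pr >= floor]
        while len(picked) > target and floor < 10:
            floor += 1
            picked = [u for u, pr in urls_with_pr if pr >= floor]
        return floor, picked

    floor, picked = manual_list([("http://example.com/post", 6)] * 50)
    print(f"Do PR{floor}+ by hand: {len(picked)} URLs")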

Gov/Edu backlinks?

These are your best-of-the-best backlinks... sometimes :) There are still shitty ones: ones that have been spammed to death, or have n/a pagerank. I do accept some PR0 gov and edu links! They are still golden... but I remove the garbage pages. When harvesting for them, you will usually get a dozen pages per domain. Clean them up and remove high-OBL pages. If you have a ton of pages from one domain and half of them are PR n/a or 0, remove those. If you have PR3 or higher ones, I would post by hand; they are gov/edu sites and you won't have THAT many to sift through! Do the low-PR ones automatically with SB, and the high-PR ones manually with SB, then run the failed URLs again. (Also, turn on your captcha solver for these. You want every extra gov/edu link you can get, and we don't want to run the list, sort out the captcha URLs, and have to run again... save a step, since the total number is small. It's not like you get hundreds of thousands of gov/edu sites.

You may get 5k if you're lucky!) After running the failed URLs twice, if you have only a couple thousand left (2 or 3 thousand at most), run your private proxies on them. You get a lot of other blog platforms, but still a good amount of Wordpress, so we are still worried about getting on blocklists. Use a low number of threads, maybe 30 max, depending on the size of your list. You don't want the list to take 24 hours to run, but if it takes 2 hours at 10 threads, that's fine. Fewer threads means a better success rate but a longer run time, so again, find a good balance between the two factors. A low rate also lowers the risk of getting blacklisted. It is still a risk, but we are trying not to be stupid about it! If I lose 20 private proxies to get 300 gov/edu backlinks to go live, it's worth it to me.
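
The threads-versus-time trade-off is simple arithmetic. A back-of-envelope sketch; the 20 seconds per post is a made-up figure, so time a small batch first and plug in your own number:

    def run_hours(num_urls, threads, secs_per_post=20):
        # Rough estimate: total work divided across parallel threads.
        return num_urls * secs_per_post / threads / 3600

    print(run_hours(3000, 10))  # ~1.7 hours at 10 threads
    print(run_hours(3000, 30))  # ~0.6 hours at 30 threads, but more risk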

I don't want to stick my neck out if I don't have to (meaning risking my private proxies)... but when you have been doing SEO for a while, you know what will really make a difference. If you really think those few hundred gov/edu backlinks, or that chunk of awesome high-PR captcha backlinks, will make a difference and possibly be the factor that gets you to the first page of Google... then it may be worth it :) You need to factor in the keyword you're ranking for, the niche, the competition; everything goes into the judgement. If I were ranking for "make money online", a "dream" keyword probably nobody here will rank for, I would say it isn't worth it: a dream keyword, with super competition, is a LONG shot at best! So don't risk your proxies for shit like that. However, in side niches, I have done a strong SEO campaign and gone to page 1 in a month. Boom: a 1k visitors/day site, making over $1k in commissions a month... so it was worth losing $50 worth of private proxies. Just buy some more and continue on. $50 was nothing compared to the rewards and totally worth it...

Your keyword research should be setting you up for realistic sites/keywords that can make you at least a few hundred bucks a month! If you are pissing around with keywords that will bring you barely 20 people a day, then you really need to rethink... umm... basically everything. Yeah, sites die off... but shit, it's easy to pop up sites that make $300-$500/month for around 4-6 months. Sometimes you get sites that last over a year, and some that die off soon after making a few hundred bucks. You can try to breathe life back into a dying site (if the traffic was jaw-dropping, it's totally worth fixing), or move on to the next. For your sites that are doing well, I suggest doing another SEO push every month or every other month. Just because you have the rankings now doesn't mean you will a week later! Rankings change, and simple routine SEO on sites already ranking can help keep them ranking!

It all comes down to using good judgement. I baby and shelter my private proxies... however, I know when it's time to bring the big boys out. My trusty sidekicks in SEO. They are my "A team"... though 90% of the time I am running my "B" and "C" teams. Do one large harvesting job, or one large PR-checking job, and you could end up with worthless private proxies. I hear that once you're blacklisted by Google, you're also blacklisted for ReCaptcha, which is the most common captcha on blogs. I haven't gone out and tested it, but if that is true, once you're blacklisted by Google you're screwed for maybe 3/4 of your captcha blogs! Also, I think Akismet and Google talk? Hell, Google could own Akismet; they own about everything these days lol. So it could be possible that if you're blacklisted by Akismet, you're blacklisted by Google, or vice-versa. It is possible, and something like it was actually implemented with Craigslist back in the day: Google and CL shared a blacklist database, because people were mass-creating Gmail accounts to spam CL with the popular software at the time.

They teamed up, and once you got hit, you were hit by both! Get blacklisted by Google? Boom, all your related CL accounts on that IP were dead. Get blacklisted by CL? Boom, all your Gmail accounts were dead. Now, people were using proxies, etc., but Google can tie things further than that. I think they had CCleaner in there to run between postings too lol. That just goes to show WHY you don't want to get blacklisted by either Google or Akismet... it could ruin your whole day. Without you even being aware, just because you harvested 50k URLs with Scrapebox, your posting run to 5k PR5+ captcha blogs could be a waste of time... you could be stopped by Akismet or ReCaptcha, all because your private proxies got blacklisted by Google. Whether it is actually tracked that deep right now, I don't know... but it is possible. I do know that once you're flagged by Akismet, your success rates will plummet. So imagine adding in getting blocked by ReCaptcha!

Some of that was just theory, part of my "thinking process". Using Scrapebox is easy: anyone can load up some lists and blast away. What makes a good SEO expert different from an average SEO person is their process. I tend to build fewer backlinks, but of higher quality. I outrank tons of people with only a couple thousand backlinks while they have 10k or more. Quality trumps quantity. Maybe I am too cautious? Maybe I am too paranoid, thinking too deeply about whether Google, Akismet, and ReCaptcha are all linked... However, history has shown that Google has done this kind of thing in the past to combat spam.

Google hates spam, hates Scrapebox, and it's only logical to think that Google will assist in fighting spam. It messes with their ranking systems and costs them money. As we get more advanced, Google has to fork out tons of cash to a team of nerds to re-work the algorithm. Google's goal is to be the best search engine, and it can only do that by having the best algorithm producing the best rankings. We are a pain in Google's ass. So it only makes sense that Google -> Akismet -> ReCaptcha... would be linked somehow to fight back against spam.

Okay, okay... Mr. Smarty Pants Deep Thinking Nutcase... what if you're wrong? Well, I don't care. Truth is, I know it is extremely possible. Since Google has shown they will team up with other companies to combat spam, we can't eliminate it as an option. EVEN IF they are not doing it right now, think about this: what if Google started? Next update they team up and bitch-slap all of us... for backlinking we did a year ago. Google can go back and re-analyze all our data. With their new "database", they could spot everyone who tripped the alarms along the way... then boom, de-index all of them. Yes, Google can do it. They HAVE de-indexed sites for bad backlinking practices done in the past. Just because you are safe doing something now doesn't mean they won't change their mind later!

It would be like Google saying "starting today, anyone who has ever gotten a backlink from a Wordpress blog will be de-indexed". Saying "okay, I know not to use Wordpress anymore" doesn't help: you're still de-indexed because of what you did a year ago, even though the rule didn't exist back then! So just think about it... if we go through blasting away, tripping all the alarms along the way, a year from now your sites could be at risk... if you left enough of a trail. And it wouldn't be hard to leave one; you're doing a shit-ton of posts!

So I try to fly under the radar. I think a little deeper than most, yes... However, I still get 1st-page rankings, and I still generate thousands of backlinks! I don't want to lose tons of sites for being careless in the past. That is a lot of money to lose.

If I could sum it up in one word, it would be "precision". My SEO is laser-targeted and precise. I don't need to drop a nuke when I can send in the Navy SEALs.

Do you even need to use proxies?
This depends on what program you're running, what you're doing... and whether you can change your own IP address.

For example, I don't use private proxies with SENuke. I use a different IP address for EACH FULL RUN of SENuke. If I am doing runs for 3 clients, I change my IP between each client's run and run CCleaner. Think about it: if you had 10 private proxies, what happens after you have used all 10 of them? You will have tons and tons of runs repeating the same IP addresses. I don't want to get an IP flagged during a run and lose half my work on a past client's project. If your IP gets flagged by Wordpress, boom, all of your Wordpress properties built on that IP could go down with it! Say you had 50 Wordpress sites made, all from the same IP, for maybe 30 clients... they all lose their Wordpress backlink. And if it is a link pyramid, and you lose a level-1 property with 100 sites pointing at it... well, you broke your pyramid and it just got weaker! So we don't want anything to tie our projects together.

It isn't right to risk 20 clients' rankings. A different IP, each and every time! I run EVERYTHING in one SENuke job with the same IP; if I have to do another job, I change my IP and repeat. Private proxies here would limit us and aren't the best option. Your REAL IP will deliver the BEST success rates... and SENuke is worth getting everything you can out of it! Also, have a good captcha solution. I do the social network sites by hand... the rest, auto captchas, because there are too many. I also lower my threads to 5 and retries to 4.
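
One habit worth scripting: verify the IP actually changed before you start the next client's run. A minimal sketch using the ipify echo service (any "what is my IP" page works):

    import requests

    def my_ip():
        return requests.get("https://api.ipify.org", timeout=10).text

    before = my_ip()
    input("Power-cycle your modem, wait for it to sync, then press Enter...")
    after = my_ip()
    if after != before:
        print("Good to go, new IP:", after)
    else:
        print("Same IP! Try the MAC clone trick before running the next job.")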

I am currently looking for a good captcha solution. I was using DeathByCaptcha, but switched to ImageTypers to try to get better results. They seem to have an OKAY success rate at 5 threads, but 10 threads is too much. SENuke flat out told me: your problem is your captcha solution! WTF! So I have been running SENuke slow, at 5 threads and 4 retries, and I retry my failed sites again. I now recommend 5 threads and 4 retries even with a badass, accurate captcha solution, just to be sure you get better success rates.

So right now I am on the hunt for better captcha rates, and I have a few other services to try. Moral of the story: if you think you're missing sites because the captchas aren't getting solved, try a few other providers. You're not going to get perfect, but damn, less than 50% is terrible!

So, I said you don't need proxies with SENuke. You DO NEED proxies with Scrapebox, xRumer, GSER, and any harvesting activities... and with anything that may link your activity together. Wordpress has Akismet; without it, we could hit one site and the others would never know we just tried 20k other sites. If it weren't for Akismet, we wouldn't need proxies for Wordpress, because nothing would be tracking us; each site would be blind. The same is true for forums: there are Akismet-like systems for them too.

I don't know if forums come with that out of the box (like checking the Spamhaus database), or if by default they are all blind too, with common plugins linking them to an anti-spam database. I'm not sure of that answer. It doesn't matter though, because you also have to worry about the captchas. If you post to 30k sites that all have ReCaptcha... you think nobody would notice? ReCaptcha would. So when we need to hit a LOT of sites, in masses, hard and fast, we use public proxies. While a small percentage of the sites may be blind, there is no way to sort them out; after maybe 5k or 10k sites, your IP would be blacklisted, and your success rate would die. You might still hit the blind sites... but you'd be missing over 80% of your potential successful posts.

BookmarkingDemon - It handles a handful of bookmarking platforms. Again, the captcha issue, and the possibility of a "blacklist network". You may hit 10k sites across only 6 platforms... not enough diversity... and we can be sure that running just our real IP would get us blacklisted somewhere and kill our success rates. So, public proxies here again. If you have high-pagerank URLs, split them off and use private proxies on that small list. Lower speed, at least 10 private proxies, and maybe 2k-3k URLs at MOST! And yeah, I would use a captcha solution.

I know, I know: SENuke has a bookmarking module of what, 100-200 sites? You can import URLs into there too! Scrape the bookmarking platforms with Scrapebox or hrefer, filter and scrub them, and use them in SENuke or BookmarkingDemon. Remember, though: we use our REAL IP with SENuke! So keep your bookmarking list small and powerful here. If you want to run a huge list, run it by itself in SENuke and turn your proxies on first! You don't want proxies on for the whole SENuke run, just for the bookmarking module.

ArticleMarketingRobot - Same situation as before: only a handful of different platforms, so we have to watch out for blacklists. However, there just aren't nearly as many article sites as there are blogs, forums, or bookmarking sites... so it depends on how big your list is! Up to 100-200 URLs, you can use your own IP at a low speed, or use 10 private proxies. 1k-2k, maybe 10 private proxies. 3k-5k, maybe 20 or 30 private proxies. Any more than that, and you need to segment the list: PR4 and higher gets private proxies, lower gets public proxies. Again, come up with your own figures based on the size of your list, the number of private proxies you have, etc.

Very small: use your own IP. Medium: use your private proxies. If the job is too big to risk your private proxies, reduce the risk by using public proxies on the lower-value links and private proxies on your better ones :)

xRumer - Public proxies. You flat out won't get far using your real IP address, and xRumer has a badass proxy system that auto-harvests, auto-checks, and always provides you with a pool of good public proxies! You can add in your own list of sources, which I highly recommend! Don't use the default leech list it comes with; make your own, so you're not using the same proxies hundreds of other xRumer skids are using!

After Google Penguin, you must use xRumer more carefully. Follow these 2 tips:

1. Use private or shared proxies. Without any proxy at all, your IP will get banned fast and your xRumer will be useless. With private or shared proxies, bans will be less frequent, so your success rate will be higher.

2. Don't spam your site directly. Blast your Web 2.0 properties that link to your inner pages (don't link to your homepage), or your linkwheel pages.

With xRumer you can build your own modules, or buy them. For example, you can have an xRumer setup JUST for social bookmarking platforms, meaning that is ALL it knows! You can have one for social networking, one for link wheel or link pyramid creation... like the social network module in SENuke, only with a TON more sites in it :)

If you're doing a small submission like this, of a few hundred sites, you could use your real IP or private proxies. For large blasts, though, you will want public proxies. So there are cases where you can get by on your real IP, but only for your laser-targeted, small runs. You could also change your IP in between those runs. Changing your IP doesn't help on large runs though; just use public proxies!

hrefer harvesting, scrapebox harvesting...
Use public proxies! You're scraping a shitload of data, and it's easy to get blacklisted by Google and other search engines. Actually, it WILL happen, every time! You can lose half your public proxies on a single large harvesting job! Wait a couple of days and you will have a new pool of public proxies and be all good again. Even 50 private proxies aren't enough for large scraping runs. You can get away with small scrapes of a few thousand URLs if you're after something specific, like all of a website's indexed pages, or its backlinks, or the top 100 sites for a keyword... but on large runs, you have to use public proxies. Even if you have 50 private proxies, that is a lot of money to throw out the window... and you WILL lose some, if not all of them!
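
Scrapebox has its own proxy checker, but rolling a quick one yourself shows how simple the idea is: keep only the proxies that can still reach Google within a few seconds. A sketch, assuming a public_proxies.txt file with one ip:port per line:

    import concurrent.futures
    import requests

    def alive(proxy):
        # A proxy counts as alive if it can fetch Google before the timeout.
        try:
            requests.get("https://www.google.com",
                         proxies={"http": f"http://{proxy}",
                                  "https": f"http://{proxy}"},
                         timeout=5)
            return proxy
        except requests.RequestException:
            return None

    with open("public_proxies.txt") as f:
        candidates = [line.strip() for line in f if line.strip()]

    with concurrent.futures.ThreadPoolExecutor(max_workers=50) as pool:
        good = [p for p in pool.map(alive, candidates) if p]

    print(f"{len(good)}/{len(candidates)} proxies still alive")
    with open("good_proxies.txt", "w") as f:
        f.write("\n".join(good))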

You have to know when it's time to use private proxies, and to know that, you need to KNOW in which situations you should NOT use them. Scraping 100k URLs? I am pretty damn sure you will lose ALL of them, so we don't use them for that! We don't use them for HUGE jobs... we use them for our smaller, laser-targeted jobs. We use them where we need accuracy. We use them in situations where we can get away with it!

Public Proxies are like your "pawns" in chess. They are your bitches, and we are willing to sacrifice them!


Private proxies are like your queen. You only pull her out to clean house and mop the table! When she comes out, it's because you mean business (hopefully, and not because you lost all your other pieces!). She is powerful, can move all over the board, and is your last line of defense before the king! You don't throw your queen out there right off the bat; you will probably lose her. Even if you don't, mathematically the odds are high that you will. So you don't risk your queen unless you have to! Sometimes you may dodge the bullet... but do your best to keep her! And sometimes, if your pieces are set up right, you can risk her if you need to... but only for a checkmate!

Hopefully this will help some people out. I know it's a lot of reading... I know this because my hand hurts now! So give thanks please :)

How and where to get public proxies is beyond the scope of this post! There are tons of programs, articles, and information out there on that, so go use Google! A lot of beginners buy private proxies without knowing how to use them, and waste $30 by harvesting 100k URLs right away. If you are getting shitty success rates with SENuke, you can just use your own IP address, UNLESS you cannot change your IP. Many beginners don't know when it's time to use the private proxies, and struggle to get good, high-pagerank backlinks...

Your results are only as good as your lists. I have built my lists in Scrapebox for years (since it came out). I have something like 12 million URLs! Good housekeeping will go a long way! I know what I have already run. I have lists "pre-checked", lists "needing PR check", lists needing an "alive check" or "OBL check"... plus tons of lists already checked for each of those. I have large lists of nothing but captcha sites!

I have large lists of high-PR sites, and lists of auto-approve sites. After a Scrapebox run, it is GOOD to export your "failed", "success", and "captcha" URLs into lists. For example, add your newly found captcha URLs to your main captcha list, so at will you can load up a list of, say, 10k sites with captchas. There is a method to my madness. To most people, my lists look all over the damn place; to me, I know what is what and where it's at!

I can get a REAL 2k backlinks on a site with just Scrapebox... or 200 gov/edu backlinks... or 500 PR5-and-higher backlinks. I am talking fire at will: load a list and get what I want, not load up a list of 100k and see what happens. Load up a list of 10k URLs and get 1k of them to actually SHOW UP! You can only get this (or better) by separating your lists out in a method that works for you!
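
The housekeeping itself is trivial to automate. A sketch that folds a fresh export into a master list, deduped; the file names are just examples:

    def merge(master_file, new_file):
        seen = set()
        for name in (master_file, new_file):
            try:
                with open(name) as f:
                    seen.update(line.strip() for line in f if line.strip())
            except FileNotFoundError:
                pass  # first run: the master list doesn't exist yet
        # Rewrite the master list with every unique URL, sorted.
        with open(master_file, "w") as f:
            f.write("\n".join(sorted(seen)))
        return len(seen)

    total = merge("captcha_master.txt", "captcha_new_run.txt")
    print(total, "unique captcha URLs in the master list")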

What is DSL?
Digital Subscriber Line: the technology used to transmit digital data over the telephone line. If your broadband comes in over the phone line, it uses DSL. There's ADSL (asymmetric DSL), which is commonly used for home broadband, and SDSL (symmetric DSL).


What is Dynamic IP?
http://www.makeuseof.com/tag/technology-explained-what-is-a-dynamic-ip-address-how-do-i-get-one/

An IP address assigned by a DHCP server is called dynamic because it will often be different on each connection to the network. The public IP address assigned to the routers of most home and business users is a dynamic IP address. Some Internet Service Providers assign "sticky" dynamic IP addresses, which do change, but less frequently than a typical dynamic IP address. Larger companies usually do not connect to the Internet via dynamic IP addresses; instead they have static IP addresses assigned to them, which do not change. On a local network, like in your home or place of business, where you use private IP addresses, most devices are probably configured for DHCP and thus use dynamic IP addresses. However, if DHCP is not enabled and you've configured your own network information, you're using a static IP address.

How to change your IP? 
If you're on DSL, you can usually just unplug your modem and plug it back in; when it restarts, you usually get a new IP. If you're on cable, you still have a "dynamic" IP, which means your IP CAN change... however, most cable companies use DHCP leasing!

This means your IP is locked to you, usually for 24 hours! However, that IP is actually locked to the FIRST MAC address found on your network. If you're directly connected to your modem, that's the MAC address of the network card in use. If you have a router, it's the MAC address of the router. You can then use the router's "MAC Address Clone" section to change its MAC address. Save.

Reboot both your router and your modem (router first!). The exact steps for changing your IP address are beyond the scope of this post! I CANNOT HELP YOU! Everyone has a different ISP, a different setup, etc. It may take some searching, some reading, and some trial and error. I just listed the two main possibilities. First, power-cycle your modem.

If you don't get a new IP after it comes back up, change the MAC address of the first device on your network. The IP is locked to the MAC, so changing it forces a new IP. Your modem may be different, your network card may be different, and your router may be different. I gave you where to look, so it's up to you to dig deeper and find the answer for your setup.
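
If you need a value for the MAC clone field, any random locally-administered unicast address will do. A small sketch that generates one (the example output is, of course, random):

    import random

    def random_mac():
        # Set the locally-administered bit (0b10) and clear the multicast
        # bit (0b1) in the first octet so the address is valid for cloning.
        first = (random.randint(0, 255) | 0b10) & 0b11111110
        rest = [random.randint(0, 255) for _ in range(5)]
        return ":".join(f"{b:02X}" for b in [first] + rest)

    print(random_mac())  # e.g. 2A:5F:13:9C:07:D4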



Scrapebox Proxies Used

Main Scrapebox GUI:

Fast Poster - Used
Slow Poster - Used
Manual Poster - Used
RSS Ping - Used
Link Checker - Not used


Addons:
Backlink Checker - Used
Fake PR Checker - Used
Dofollow/Nofollow Check - Used
Blog Analyzer - Used
Malware Addon - Used
Alive Check - Used
Google Image Grabber - Used
Google Competition Finder - Used
Rapid Indexer - Not used
Link Extractor - Not used
Dofollow Test - Not used
Sitemap Scraper - Not used
WhoIs Scraper - If you have SOCKS proxies, they are used; HTTP proxies are not used


