Thursday, January 10, 2013

DJMark (Dj Markworm) - Practice Sessions hiphop-rnb

Monday, April 9, 2012

SEO: Xrumer processes explained!

POSTING PROCESS
    Xrumer will first check all the form fields found on the site against xignoreforms.txt.
    It will then evaluate the importance of each form internally. The more fields a form has, the more likely Xrumer is to treat it as the new-message form.
    If a posting form is not found, Xrumer will try to log in to the site, using the U: and P: credentials if they are present in the link list. Otherwise, it follows the username and password settings in the Project Manager.
    If the login fails, Xrumer will try to register on the site, which follows the registration process described below.
    After logging in, it looks for the new-topic link (URL_TONEWPOST) within the URL_TOVIEW and URL_SUBVIEW pages defined in xurl.txt. If it does not recognize any posting form, the current site is skipped.
    If a posting form is found, it fills in the form according to xas.txt.
         a) The posting status is checked against the SUCCESS keywords in xmessages.txt, or against the text of the message written in your Project Manager. If Xrumer can see your message on the site, it is counted as a success even if xmessages.txt does not contain keywords that indicate it.
         b) If unknown fields are found - check whether the Self-Learning System is on.
    From this point onwards, the behaviour is the same as the self-learning step you see in the registration process.
    If "apply Xrumer assumption on new field" is ticked in your Self-Learning System options, Xrumer will submit the form based on x_apply.txt.
    If your message is found:
         a) The editing link, based on URL_TOEDIT in xurl.txt, is saved to your Elinklist.txt. The site goes into the "success" list in your report log.
         b) The reply link, based on URL_TOREPLY in xurl.txt, is saved to your Rlinklist.txt. It also goes into the "success" list in your report log.
    If the forum requires posts to be moderated by admins, the site will still go into the success list.
    If the message is not found, the site goes into the "partial success" list in your report log.
    Invalid, banned and all other cases go into "the rest" list in your report log.
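
For readers who prefer code, here is a very rough sketch of the posting decision flow above. This is not Xrumer's actual code; every function and attribute name below (find_posting_form, login, save_link, and so on) is a hypothetical placeholder that only mirrors the config files mentioned in this post.

Code:
# Rough illustration of the posting flow described above (not Xrumer's real code).
# All helpers (find_posting_form, login, register, ...) are hypothetical placeholders.

def post_to_site(site, project):
    form = find_posting_form(site, ignore_file="xignoreforms.txt")   # skip forms listed in xignoreforms.txt
    if form is None:
        # No posting form found: try to log in, then fall back to registration.
        creds = site.credentials or project.credentials              # U:/P: from the link list, else Project Manager
        if not login(site, creds):
            register(site, project)                                  # see REGISTRATION PROCESS below
        form = find_new_post_form(site, "xurl.txt")                  # URL_TONEWPOST via URL_TOVIEW / URL_SUBVIEW
        if form is None:
            return "the rest"                                        # site is skipped
    submit(form, fill_fields(form, "xas.txt"))                       # fill the form according to xas.txt
    if message_visible(site, project.message) or matches_keywords(site, "SUCCESS"):
        save_link(site, "URL_TOEDIT", "Elinklist.txt")               # editing link
        save_link(site, "URL_TOREPLY", "Rlinklist.txt")              # reply link
        return "success"                                             # moderated posts also land here
    return "partial success"                                         # message not found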


REGISTRATION PROCESS


    Xrumer will try to find the registration link, based on xurl.txt, on the given URL (taken either from the link list or from TEST mode).
    It follows the registration URL and checks whether the page contains form fields.
    If fields are found, it checks the form field names against your xas.txt.
    If XAS recognizes all the fields:
         a) i) Check whether there is a captcha in the form, based on:
             - DeCaptcha/default.mask.txt
             - DeCaptcha/user.mask.txt
             For ReCaptcha, Xrumer checks the JavaScript internally.
             Text captchas are detected based on textcaptcha.txt.
             i) If the captcha is recognized, decode the picture/text captcha using the number of attempts set in the advanced options, as well as the number of attempts allowed before using an external captcha service.
            ii) If the total attempts are exceeded - Report log: Successful (this is the behaviour in Xrumer 7 Beta 3). PS: not 100% accurate.
    Xrumer will fill in the form, submit it, and look for the keywords under INVALID, SUCCESS, ACTIVATION, TEXTCAP_FAILED, PICTOTRY, BLOCKED and TAKEN in xmessages.txt.
    a) i) Check whether "edit profile after registration" in the advanced options is on or off.
       ii) If it is checked, log in and look for the profile link based on xurl.txt.
      iii) If it is not checked, the site is reported under the "Successful" category in the report log.
    b) If the server response is not recognized, the site is reported under the "Part Succ" category in the report log.
    c) If the registration requires activation, the site is reported under the "Activation required" category in the report log.
    d) If the registration is invalid or blocked, the site is reported under "The Rest" category in the report log.
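
Similarly, a rough sketch of the registration flow, again with hypothetical placeholder helpers rather than Xrumer's real API:

Code:
# Rough illustration of the registration flow described above (not Xrumer's real code).
# All helpers and attribute names are hypothetical placeholders.

def register(site, project):
    reg_form = find_register_form(site, "xurl.txt")                  # registration link from xurl.txt
    if reg_form is None or not fields_recognized(reg_form, "xas.txt"):
        return "the rest"
    captcha = detect_captcha(reg_form)                               # default.mask.txt, user.mask.txt, textcaptcha.txt, ReCaptcha JS
    if captcha is not None and not solve(captcha, max_attempts=project.captcha_attempts):
        return "successful"                                          # Xrumer 7 Beta 3 quirk noted above
    response = submit(reg_form, fill_fields(reg_form, "xas.txt"))
    status = match_keywords(response, "xmessages.txt")               # INVALID, SUCCESS, ACTIVATION, ...
    if status == "SUCCESS":
        if project.edit_profile_after_registration:
            login(site, project.credentials)
            edit_profile(site, "xurl.txt")                           # profile link from xurl.txt
        return "successful"
    if status == "ACTIVATION":
        return "activation required"
    if status is None:
        return "part succ"                                           # server response not recognized
    return "the rest"                                                # invalid / blocked / taken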

Monday, March 28, 2011

Market Samurai 0.87.7 Cracked Rapidshare

Everything you need bundled together to get rolling properly and all for FREE BABY!!!!!
When you open up Market Samurai you will find the following modules:

* Keyword Research – Find additional keywords related to your seed keyword and analyze their traffic, AdWords value, competition, buying intent and more.
* Search Engine Optimization Competition – Market Samurai will take a look at the top ten results for your keyword and show you how well they rank for a number of important Search Engine Optimization factors.
* Rank Tracker – Add in your domains and keywords, and Market Samurai will show you where your sites rank for each of your keywords over time on Google, Yahoo and Bing.
* Monetization – Find products to promote on Clickbank, Amazon, CJ, and PayDotCom.
* Find Content – Find content related to your keywords, which you can add to your website or blog.
* Publish Content – Manage all your WordPress blogs in one place and publish your content to them easily.
* Promotion – Find Web 2.0 sites, blogs and forums related to your keyword where you can place your links.

When building niche websites, doing proper keyword research is the key to success. You want to find keywords that get a lot of traffic, but don't have so much competition that they are impossible to rank for. To start with the keyword research module, you first enter a seed keyword into Market Samurai. Hit the Generate Keywords button and Market Samurai will query Google and return a list of keywords relevant to your seed keyword. You can filter this initial list by a variety of criteria, like traffic, phrase length, and positive and negative keywords.


Recommended: you must have a Google AdWords username and password for the Market Samurai Keyword Research module to work.

vtscan
Code:
http://www.virustotal.com/file-scan/report.html?id=14ef48e4e1b1acbc6651be48144de02c1133b8c607967c749cddc9580ed3de6e-1300704182



download
Code:
http://hotfile.com/dl/111190686/7690a2d/MarketSamurai.rar.html

Sunday, February 27, 2011

Spam Spam Spam.

As an SEO consultant, you should always remember that spam is delicious in the real world, but in the SEO world it tastes really bad!

Remember, when commenting or posting on forums, post it right. Do not spam!


Friday, February 11, 2011

Senuke download

INSTRUCTIONS
Follow the steps below and you're good to go...
1. Uninstall the previous version of SEnuke.
2. Run CCleaner and restart the computer.
3. Install SEnuke 6.94.
Insert this password kw6n7492urPqlknioKje8910knqq
4. Copy msvcr100.dll to the installation directory of SEnuke,
E.g. C:\Program Files\SENuke
5. Run SEnuke.exe and enter any email and password...


Saturday, January 2, 2010

What is a linkwheel?

The linkwheel, or link wheel system, has been around for a long time. SEO consultants like me have been using this method to build links and promote websites. A link wheel is very simple, but it takes solid SEO knowledge to build one; otherwise you risk getting your website banned from Google.

So my advice to you is that you should use the linkwheel wisely.

An overview of what a linkwheel is:

Remember to use the linkwheel method wisely. Don't say I didn't warn you.


Tuesday, December 29, 2009

Robots.txt - what is robots.txt

What Is Robots.txt?

Robots.txt is a text file in your site's root folder. It tells search engine bots, such as Googlebot, which pages you would like them not to crawl. You are not required to have a robots.txt file on your website, but if you need to control crawling, you should add one. Robots.txt acts like a note to search engine bots such as Google, telling them where they may crawl and which web folders they are allowed to access.


Remember that robots.txt must be in the main directory (the root directory of your website) so that search engine bots such as Googlebot can find it. Search engine bots do not search the whole site for a file named robots.txt; instead, they look only in the main directory.

http://www.yourdomain.com/robots.txt

Robots.txt has been around since the early days of the internet. You can visit http://www.robotstxt.org/ to learn more about robots.txt.


The structure of a robots.txt file is simple.

User-agent: - the name of the search engine crawler.

Disallow: - lists the files and directories to be excluded from crawling.

You can include comment lines; just put the # sign at the beginning of the comment. Refer to the example below.

# This is a comment in robots.txt

--------------------SAMPLE CONTENTS OF robots.txt -----------------------

User-agent: *

Disallow: /temp/

------------------------------------------------------------------------

To avoid serious problems and logical errors, please follow the basic syntax/format of robots.txt, or use a robots.txt validator to check your file for errors before uploading it.
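
If you would rather check your rules programmatically than with an online validator, Python's standard library includes a robots.txt parser. The snippet below is only an illustration, using the sample rules above and a placeholder domain (www.yourdomain.com); it is not part of any tool mentioned in this post.

Code:
# Minimal check using Python's built-in robots.txt parser (urllib.robotparser).
# The domain and paths are placeholders for your own site.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.set_url("http://www.yourdomain.com/robots.txt")
rp.read()  # downloads and parses the file

# True if the given user-agent may crawl the URL, False otherwise.
print(rp.can_fetch("*", "http://www.yourdomain.com/temp/page.html"))  # False with the sample rules above
print(rp.can_fetch("*", "http://www.yourdomain.com/index.html"))      # True with the sample rules above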