Having launched many sites, and redesigned many more, I have plenty of experience with getting web sites crawled and indexed.
I don't think submitting your URL to the major search engines is the best way to get crawled. First, you will have to wait in the queue. For those who think there isn't a wait, take a look at the growth rate of the internet. If that doesn't convince you, try to buy a domain that's just one word, then try misspellings, then try adding another word to it. Okay, so let's say you're the patient type: your first crawl is usually shallow. Getting an inbound link from a site that is known to be regularly updated, and therefore regularly crawled, is a far more effective way to get noticed, particularly by Google. Assuming the two sites are related and the title, anchor text, and so on are optimized, my experience is that the depth of crawl your site receives from that inbound link is directly related to the relative importance of the page carrying the outbound link. Getting a jump start on crawl depth can be important, especially if the site is large.
The second benefit of inbound links is the additional exposure that deepens the crawl naturally. If you optimize a variety of pages and create inbound links to them, you force the bot to see more of your site. Each time a bot hits your site, it covers a certain radius around the page it lands on. Think of it like getting hit with a water balloon: if the balloon keeps hitting you in the same place, it will take a very long time to get completely wet. (That is, assuming Googlebot deems your site fit to be indexed at all and doesn't stuff it in the supplemental index until it gets more information on whether to let your site into the "real" index, which can take months.) Each time the bot comes back to your site through a new avenue, it is exposed to different parts of the site, the crawl gets deeper, and it learns how much new content is being added and how often it needs to visit to keep up. Doesn't it make sense that your site will get crawled more frequently and deeply if the bot ends up there from a variety of locations, frequently?
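The water-balloon intuition can be made concrete with a toy model. The sketch below (my own illustration, not anything Googlebot actually runs) treats a site as a simple link graph and a crawl as a depth-limited breadth-first traversal. Three shallow crawls that all start at the homepage keep re-covering the same pages, while three equally shallow crawls arriving via different inbound links cover far more of the site:

```python
from collections import deque

def crawl(graph, entry, depth):
    """Breadth-first crawl from one entry page, limited to `depth` link hops."""
    seen = {entry}
    frontier = deque([(entry, 0)])
    while frontier:
        page, d = frontier.popleft()
        if d == depth:
            continue  # radius exhausted; don't follow links any further
        for nxt in graph.get(page, []):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append((nxt, d + 1))
    return seen

# Hypothetical site: a chain of 10 pages (0..9), each linking to the next.
site = {i: [i + 1] for i in range(9)}

# Three shallow crawls that all land on the homepage (page 0)...
same_entry = set().union(*(crawl(site, 0, 2) for _ in range(3)))

# ...versus three shallow crawls arriving via different inbound links.
varied_entries = set().union(crawl(site, 0, 2), crawl(site, 3, 2), crawl(site, 6, 2))

print(len(same_entry), len(varied_entries))  # prints "3 9"
```

Same crawl budget, same depth; the only difference is where the bot lands, which is exactly the argument for spreading inbound links across a variety of optimized pages.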
So why would you want to use “add your URL” at all? Are there any benefits?