This document discusses using crowdsourcing to annotate web services for a search engine. It describes crawling web pages to identify APIs, noting that human confirmation is still needed to validate the candidates. An annotation wizard was created so that Amazon Mechanical Turk workers could categorize and tag pages. Initial results showed low-quality annotations, but limiting the tasks and increasing pay raised accuracy to about 80%. Crowdsourcing was found to be a fast, low-cost way to generate high-quality annotations.
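The summary does not say how redundant worker answers were combined or how the roughly 80% accuracy was measured, but a common approach in Mechanical Turk pipelines is to assign each page to several workers and aggregate their labels by majority vote, then score the result against a small gold-standard set. A minimal sketch under those assumptions (the item names, labels, and gold set below are hypothetical):

```python
from collections import Counter

def majority_vote(assignments):
    """Aggregate redundant worker labels per item by majority vote.

    assignments: dict mapping item id -> list of worker labels.
    Returns a dict mapping item id -> winning label.
    """
    return {item: Counter(labels).most_common(1)[0][0]
            for item, labels in assignments.items()}

def accuracy(predicted, gold):
    """Fraction of gold-labeled items whose aggregated label matches."""
    correct = sum(predicted[item] == gold[item] for item in gold)
    return correct / len(gold)

# Hypothetical labels from three workers per crawled page:
# is the page an API description ("API") or something else ("other")?
labels = {
    "page1": ["API", "API", "other"],
    "page2": ["other", "other", "API"],
    "page3": ["API", "other", "API"],
}
gold = {"page1": "API", "page2": "other", "page3": "other"}

agg = majority_vote(labels)
print(agg)                  # aggregated label per page
print(accuracy(agg, gold))  # fraction agreeing with the gold set
```

Raising the number of assignments per item makes the majority more robust to individual low-quality workers, which is consistent with the paper's finding that tightening the task setup improved accuracy.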