A major hurdle faced by many information retrieval researchers, especially in academia, is evaluating retrieval systems in the wild. Challenges include tapping into large user bases, collecting user behavior, and modifying a given retrieval system. We outline several options available to researchers to overcome these challenges, along with their advantages and disadvantages. We then demonstrate how CrowdLogger, an open-source browser extension for Firefox and Google Chrome, can be used as an in situ evaluation platform.