A new paradigm for image search metadata collection is emerging, exemplified by the Human Computation School's application of gaming principles to information science search challenges. In parallel, a suite of Web 2.0 interface applications for visual search has recently appeared, opening new interactive possibilities and visual metaphors for navigation. This article briefly introduces this paradigm shift and then looks critically toward wider innovation, with an eye on fresh territory. Arbitraging these differing methodologies opens new visual search possibilities: the affordances of and differences between the models present opportunities to offset the inefficiencies of one with the efficiencies of the other. This article capitalizes on such asymmetries, prescriptively suggesting a synergistic path that combines new image-retrieval metadata methodologies with new front-end visual search directions for future application innovation.