Science missions have limited lifetimes, necessitating an efficient investigation of the field site. The utility of onboard cameras for planning is limited by the need to downlink images to Earth before every decision. Recent advances have enabled rovers to take follow-up actions without waiting hours or days for new instructions. We propose using built-in processing by the instrument itself to enable adaptive data collection, faster reconnaissance, and increased mission science yield. We have developed a machine learning pixel classifier that is sensitive to texture differences in surface materials, enabling more sophisticated onboard classification than was previously possible. This classifier can be implemented in a Field Programmable Gate Array (FPGA) for maximal efficiency and minimal impact on the rest of the system's functions. In this paper, we report initial results from applying the texture-sensitive classifier to three example analysis tasks using data from the Mars Exploration Rovers.
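The abstract does not specify the classifier's internals, but the core idea of a texture-sensitive pixel classifier can be illustrated with a minimal sketch: compute simple per-pixel texture features (here, local mean and local standard deviation over a small window, both hypothetical choices for illustration) and assign each pixel to the nearest class in feature space. The function names and the minimum-distance rule are assumptions, not the authors' method.

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def texture_features(img, win=5):
    """Per-pixel texture features: local mean and local standard
    deviation over a win x win neighborhood (illustrative choice)."""
    pad = win // 2
    padded = np.pad(img, pad, mode="reflect")
    windows = sliding_window_view(padded, (win, win))  # (H, W, win, win)
    mean = windows.mean(axis=(-1, -2))
    std = windows.std(axis=(-1, -2))
    return np.stack([mean, std], axis=-1)  # (H, W, 2)

def train_class_means(feats, labels, n_classes):
    """Mean feature vector per class from labeled training pixels."""
    flat = feats.reshape(-1, feats.shape[-1])
    lab = labels.ravel()
    return np.stack([flat[lab == c].mean(axis=0) for c in range(n_classes)])

def classify(feats, class_means):
    """Minimum-distance classification of every pixel."""
    flat = feats.reshape(-1, feats.shape[-1])
    d = np.linalg.norm(flat[:, None, :] - class_means[None, :, :], axis=-1)
    return d.argmin(axis=1).reshape(feats.shape[:2])
```

A classifier of this kind distinguishes a smooth surface from a rough, high-variance one even when both have the same average brightness, which is the property that makes texture (rather than intensity alone) useful for separating surface materials. Such window-based feature pipelines also map naturally onto FPGA streaming logic, since each pixel's features depend only on a fixed local neighborhood.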