What I worked on:
– Trained an image classifier through Azure's vision API on thousands of labeled images to recognize signs of plant distress and environmental damage (current precision: 84.3%); the first sketch below shows the training flow
– Hooked the model up to a live camera feed with OpenCV for real-time detection (second sketch below)
– Exported the model to Core ML for mobile testing on Apple platforms using Xcode and Swift (third sketch below)
– Aligned the work with four AI for Social Good goals: Climate Action, Life on Land, Good Health and Well-Being, and Responsible Consumption and Production
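Roughly, the training flow looks like this. This is a minimal sketch assuming the Azure Custom Vision service and its azure-cognitiveservices-vision-customvision Python SDK; the endpoint, keys, tag names, and folder paths are placeholders rather than the real project values:

```python
# Minimal Custom Vision training sketch (assumed service; keys and paths are placeholders).
import os
import time

from azure.cognitiveservices.vision.customvision.training import CustomVisionTrainingClient
from azure.cognitiveservices.vision.customvision.training.models import (
    ImageFileCreateBatch,
    ImageFileCreateEntry,
)
from msrest.authentication import ApiKeyCredentials

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"  # placeholder
credentials = ApiKeyCredentials(in_headers={"Training-key": "<TRAINING_KEY>"})
trainer = CustomVisionTrainingClient(ENDPOINT, credentials)

project = trainer.create_project("plant-distress")
healthy = trainer.create_tag(project.id, "healthy")        # hypothetical label set
distressed = trainer.create_tag(project.id, "distressed")

# Upload labeled images in batches (the SDK caps one batch at 64 images).
entries = []
for fname in os.listdir("data/distressed"):
    with open(os.path.join("data/distressed", fname), "rb") as f:
        entries.append(
            ImageFileCreateEntry(name=fname, contents=f.read(), tag_ids=[distressed.id])
        )
trainer.create_images_from_files(project.id, ImageFileCreateBatch(images=entries[:64]))
# ...repeat the upload loop for the "healthy" folder with healthy.id.

# Kick off training and poll until the iteration finishes.
iteration = trainer.train_project(project.id)
while iteration.status != "Completed":
    time.sleep(10)
    iteration = trainer.get_iteration(project.id, iteration.id)
print("Training done:", iteration.status)
```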
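The live-camera piece is an OpenCV capture loop feeding frames to the published model. A sketch under the same Custom Vision assumption follows; the project ID, prediction key, and published iteration name are placeholders. Since every classification is a network round trip, this version only sends every 30th frame:

```python
# Live detection sketch: OpenCV webcam loop + Custom Vision prediction (assumed setup).
import cv2
from azure.cognitiveservices.vision.customvision.prediction import CustomVisionPredictionClient
from msrest.authentication import ApiKeyCredentials

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"  # placeholder
PROJECT_ID = "<PROJECT_ID>"                                        # placeholder
PUBLISHED_NAME = "plant-distress-v1"                               # hypothetical iteration name

credentials = ApiKeyCredentials(in_headers={"Prediction-key": "<PREDICTION_KEY>"})
predictor = CustomVisionPredictionClient(ENDPOINT, credentials)

cap = cv2.VideoCapture(0)  # default webcam
label = ""
frame_idx = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    # Throttle: hitting a cloud endpoint on every frame is too slow and too expensive.
    if frame_idx % 30 == 0:
        ok_jpg, jpg = cv2.imencode(".jpg", frame)
        if ok_jpg:
            result = predictor.classify_image(PROJECT_ID, PUBLISHED_NAME, jpg.tobytes())
            top = max(result.predictions, key=lambda p: p.probability)
            label = f"{top.tag_name}: {top.probability:.0%}"
    cv2.putText(frame, label, (10, 30), cv2.FONT_HERSHEY_SIMPLEX, 1.0, (0, 255, 0), 2)
    cv2.imshow("plant-distress", frame)
    frame_idx += 1
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```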
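For the Core ML export, Custom Vision can export a trained iteration directly, which avoids a manual conversion step. A sketch of that path, again assuming the same SDK (the "CoreML" platform string follows the SDK's export API; the exact status values during polling are my assumption):

```python
# Core ML export sketch: ask Custom Vision to export the iteration, then download it.
import time

import requests
from azure.cognitiveservices.vision.customvision.training import CustomVisionTrainingClient
from msrest.authentication import ApiKeyCredentials

ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com/"  # placeholder
PROJECT_ID = "<PROJECT_ID>"      # placeholder
ITERATION_ID = "<ITERATION_ID>"  # placeholder

credentials = ApiKeyCredentials(in_headers={"Training-key": "<TRAINING_KEY>"})
trainer = CustomVisionTrainingClient(ENDPOINT, credentials)

# Request a Core ML flavor of the trained iteration, then poll until it is ready.
trainer.export_iteration(PROJECT_ID, ITERATION_ID, platform="CoreML")
export = None
while export is None or export.status == "Exporting":
    time.sleep(5)
    export = trainer.get_exports(PROJECT_ID, ITERATION_ID)[0]

# Download the .mlmodel, ready to drop into the Xcode project for the Swift test app.
with open("PlantDistress.mlmodel", "wb") as f:
    f.write(requests.get(export.download_uri).content)
```

From there the .mlmodel file can be added to an Xcode target, where Swift code can run it on-device through Apple's Vision and Core ML frameworks.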