On Complex Supervised Learning Problems, and On Ranking and Choice Models
While simple supervised learning problems such as binary classification and regression are fairly well understood, many applications increasingly involve more complex learning problems: more complex label and prediction spaces, more complex loss structures, or both. The first part of the talk will discuss recent advances in our understanding of such problems, including the notion of the convex calibration dimension of a loss function, unified approaches for designing convex calibrated surrogates for arbitrary losses, and connections between supervised learning and property elicitation. The…
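For readers unfamiliar with the terminology, the following is a brief sketch of the calibration notion referred to above; the notation ($\mathcal{Y}$, $\hat{\mathcal{Y}}$, $\ell$, $\psi$, $d$) is introduced here for illustration and is not taken from the talk itself.

% Sketch (notation assumed for illustration, not the talk's own):
% a target loss and a surrogate acting on a d-dimensional prediction space.
\[
  \ell : \mathcal{Y} \times \hat{\mathcal{Y}} \to \mathbb{R}_+,
  \qquad
  \psi : \mathcal{Y} \times \mathbb{R}^d \to \mathbb{R}_+,
  \qquad
  \mathrm{pred} : \mathbb{R}^d \to \hat{\mathcal{Y}}.
\]
The pair $(\psi, \mathrm{pred})$ is said to be calibrated for $\ell$ if, for every distribution $p$ over $\mathcal{Y}$,
\[
  \inf_{\substack{u \in \mathbb{R}^d:\ \mathrm{pred}(u) \,\notin\, \arg\min_{\hat{y}} \mathbb{E}_{Y \sim p}\,\ell(Y,\hat{y})}}
    \mathbb{E}_{Y \sim p}\,\psi(Y,u)
  \;>\;
  \inf_{u \in \mathbb{R}^d} \mathbb{E}_{Y \sim p}\,\psi(Y,u),
\]
i.e.\ driving the surrogate risk toward its minimum forces the decoded predictions to be optimal for $\ell$. The convex calibration dimension of $\ell$ can then be read as the smallest $d$ for which some convex surrogate $\psi(y,\cdot)$ on $\mathbb{R}^d$ admits such a calibrated pair.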