ICML-2000 Workshop on Cost-Sensitive Learning

Dragos Margineantu margindr at CS.ORST.EDU
Wed Mar 8 14:27:30 EST 2000


          Invitation to Participate, Call for Contributions

                 WORKSHOP ON COST-SENSITIVE LEARNING
                 -----------------------------------

[In conjunction with the Seventeenth International Conference on
Machine Learning - ICML-2000, Stanford University, June 29 - July 2, 2000]

Workshop webpage is at:
http://www.cs.orst.edu/~margindr/Workshops/Workshop-ICML2000.html


                 Workshop Motivation and Description
                 -----------------------------------

Recent years have seen supervised learning methods applied to a
variety of challenging problems in industry, medicine, and science.
In many of these problems, there are costs associated with measuring
input features and there are costs associated with different possible
outcomes.  However, existing classification algorithms assume that the
input features are already measured (at no cost) and that the goal is
to minimize the number of misclassification errors (the 0/1 loss).

For example, in medical diagnosis, different tests have different
costs (and risks) and different outcomes (false positives and false
negatives) have different costs.  The cost of a false positive medical
diagnosis is an unnecessary treatment, but the cost of a false
negative diagnosis may be the death of the patient.  Given a choice, a
cost-sensitive learning algorithm should prefer to measure less costly
features and to make less costly errors (in this example, false
positives).  Not surprisingly, when existing learning algorithms are
applied to cost-sensitive problems, the results are often poor,
because they have no way of making these tradeoffs.
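
As a minimal sketch of this idea (the cost values and class names below
are purely illustrative, not drawn from any existing system), a
classifier that outputs class probabilities can be made cost-sensitive
at prediction time by choosing the class with the lowest expected cost
under a user-supplied cost matrix, rather than the most probable class:

  # Minimal sketch: cost-sensitive prediction at classification time,
  # given class probabilities from any probabilistic classifier.
  import numpy as np

  # cost[i, j] = cost of predicting class j when the true class is i
  # (0 = healthy, 1 = sick); the numbers are made up for illustration.
  cost = np.array([[  0.0, 1.0],    # false positive: unnecessary treatment
                   [100.0, 0.0]])   # false negative: far more costly

  def predict_min_expected_cost(class_probs, cost):
      # Expected cost of predicting class j is sum_i P(i) * cost[i, j].
      expected = class_probs @ cost
      return int(np.argmin(expected))

  probs = np.array([0.7, 0.3])                    # P(healthy), P(sick)
  print(predict_min_expected_cost(probs, cost))   # prints 1 ("sick")
  # Expected cost of "healthy" is 0.3 * 100 = 30; of "sick", 0.7 * 1 = 0.7.
  # A 0/1-loss classifier would instead predict the more probable "healthy".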

Another example concerns the timeliness of predictions in time-series
applications.  Consider a classifier that is applied to monitor a
complex system (e.g., a factory, a power plant, a medical device).  It is
supposed to signal an alarm if a problem is about to occur.  The value
of the alarm is not merely related to whether it is a false alarm or a
missed alarm, but also to whether the alarm is raised soon enough to
allow preventative measures to be taken. 
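
One possible way to formalize such a timeliness-dependent loss (the
lead-time requirement and cost values below are hypothetical, chosen
only to illustrate the point) is to charge nothing for an alarm raised
early enough, the full missed-alarm cost when no alarm is raised before
the failure, and an intermediate cost for a late alarm:

  # Illustrative sketch of a timeliness-dependent loss for one monitored
  # episode; required_lead and all costs are hypothetical.
  def alarm_loss(failure_time, alarm_time, required_lead=10.0,
                 false_alarm_cost=1.0, missed_alarm_cost=100.0):
      if failure_time is None:
          # No failure occurred: any alarm raised is a false alarm.
          return false_alarm_cost if alarm_time is not None else 0.0
      if alarm_time is None or alarm_time >= failure_time:
          # No alarm, or alarm only after the failure: full missed-alarm cost.
          return missed_alarm_cost
      lead = failure_time - alarm_time
      if lead >= required_lead:
          return 0.0   # early enough to allow preventative measures
      # Alarm raised, but with too little lead time: cost grows as the
      # lead time shrinks toward zero.
      return missed_alarm_cost * (1.0 - lead / required_lead)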

The goal of this workshop is to bring together researchers who are
working on problems for which the standard 0/1-loss model with
zero-cost input features is unsatisfactory.  A good reference on
different types of costs and cost-sensitive learning can be found at:
http://www.iit.nrc.ca/bibliographies/cost-sensitive.html.

The workshop will be structured around three main topics:

* ALGORITHMS FOR COST-SENSITIVE LEARNING
  Algorithms that take cost information as input (along with the
  training data) and produce a cost-sensitive classifier as output.
  Algorithms that construct robust classifiers that accept cost
  information at classification time.
  Algorithms designed for other types of costs.

* COSTS AND LOSS FUNCTIONS THAT ARISE IN REAL-WORLD APPLICATIONS
  What types of costs are involved in practical applications?
  What are the loss functions in current and future applications of
  machine learning?
  What are the right ways of formulating various cost-sensitive
  learning problems?

* METHODS AND PROMISING DIRECTIONS FOR FUTURE RESEARCH
  What methods should be applied to evaluate cost-sensitive learning
  algorithms?
  What are promising new directions to pursue?
  What should be our ultimate research goals?

Approximately one third of the day will be devoted to each of these
three topics.  On each of these topics, one or two people will give
overview presentations.  These will be followed by a mix of discussion
and short position papers presented by the participants.

                             Submissions
                             -----------

To participate in the workshop, please send an email message to Tom
Dietterich (tgd at cs.orst.edu) giving your name, address, email address,
and a brief description of your reasons for wanting to attend.  In
addition, if you wish to present one or more position papers on the
topics listed above, please send a one-page abstract of each position
paper to Tom Dietterich at the same email address.  You may submit a
position paper on each of the three main topics (algorithms, loss
functions, future research).  If you have an issue or contribution
that is not covered by these three categories, please contact Tom
Dietterich by email to discuss your idea prior to submitting a
position paper.  Submissions that describe the loss functions arising
in industrial applications of machine learning are especially
solicited.

The organizers will review the submissions with the goal of assembling
a stimulating and exciting workshop.  Attendance will be limited to 40
people, with preference given to people who are presenting position
papers.

Important dates:
- Submission deadline: April 24, 2000
- Notification of acceptance: May 8, 2000
- The workshop will be held between June 29 and July 2, 2000
  (exact date to be announced by mid-March)

                              Organizers
                              ----------

Tom Dietterich, Oregon State University (tgd at cs.orst.edu)
Foster Provost, New York University (provost at stern.nyu.edu)
Peter Turney, Institute for Information Technology of the 
      National Research Council of Canada (Peter.Turney at iit.nrc.ca)
Dragos Margineantu, Oregon State University (margindr at cs.orst.edu)




