[ Full Publication list: English, Japanese ]
[ BibTeX: English, Japanese ]

Publications

[ Journals, Major Conferences, Books, Articles in Books, Others, Patents, Theses ]

NOTE: The articles available below are not necessarily identical to the versions published in the journals. Please refer to the respective journals for the final versions.


Journals

  1. Sainui, J. & Sugiyama, M.
    Unsupervised dimension reduction via least-squares quadratic mutual information.
    IEICE Transactions on Information and Systems, vol.E97-D, no.10, pp.2806-2809, 2014.
    [ paper ]

  2. Calandriello, D., Niu, G., & Sugiyama, M.
    Semi-supervised information-maximization clustering.
    Neural Networks, vol.57, pp.103-111, 2014.
    [ paper ]

  3. Tangkaratt, V., Mori, S., Zhao, T., Morimoto, J., & Sugiyama, M.
    Model-based policy gradients with parameter-based exploration by least-squares conditional density estimation.
    Neural Networks, vol.57, pp.128-140, 2014.
    [ paper ]

  4. Yamada, M., Sugiyama, M., & Sese, J.
    Least-squares independence regression for non-linear causal inference under non-Gaussian noise.
    Machine Learning, vol.96, no.3, pp.249-267, 2014.
    [ paper ]

  5. Nguyen, T. D., du Plessis, M. C., Kanamori, T., & Sugiyama, M.
    Constrained least-squares density-difference estimation.
    IEICE Transactions on Information and Systems, vol.E97-D, no.7, pp.1822-1829, 2014.
    [ paper ]

  6. Niu, G., Dai, B., Yamada, M., & Sugiyama, M.
    Information-theoretic semi-supervised metric learning via entropy regularization.
    Neural Computation, vol.26, no.8, pp.1717-1762, 2014.
    [ paper ]

  7. Simm, J., Magrans de Abril, I., & Sugiyama, M.
    Tree-based ensemble multi-task learning method for classification and regression.
    IEICE Transactions on Information and Systems, vol.E97-D, no.6, pp.1677-1681, 2014.
    [ paper ]

  8. du Plessis, M. C. & Sugiyama, M.
    Class prior estimation from positive and unlabeled data.
    IEICE Transactions on Information and Systems, vol.E97-D, no.5, pp.1358-1362, 2014.
    [ paper ]

  9. Liu, S., Quinn, J., Gutmann, M. U., & Sugiyama, M.
    Direct learning of sparse changes in Markov networks by density ratio estimation.
    Neural Computation, vol.26, no.6, pp.1169-1197, 2014.
    [ paper ]

  10. Sakai, T. & Sugiyama, M.
    Computationally efficient estimation of squared-loss mutual information with multiplicative kernel models.
    IEICE Transactions on Information and Systems, vol.E97-D, no.4, pp.968-971, 2014.
    [ paper ]

  11. Kanamori, T. & Sugiyama, M.
    Statistical analysis of distance estimators with density differences and density ratios.
    Entropy, vol.16, no.2, pp.921-942, 2014.
    [ paper ]

  12. Quinn, J. & Sugiyama, M.
    A least-squares approach to anomaly detection in static and sequential data.
    Pattern Recognition Letters, vol.40, pp.36-40, 2014.
    [ paper ]

  13. du Plessis, M. C. & Sugiyama, M.
    Semi-supervised learning of class balance under class-prior change by distribution matching.
    Neural Networks, vol.50, pp.110-119, 2014.
    [ paper ]

  14. Sugiyama, M., Niu, G., Yamada, M., Kimura, M., & Hachiya, H.
    Information-maximization clustering based on squared-loss mutual information.
    Neural Computation, vol.26, no.1, pp.84-131, 2014.
    [ paper ]

  15. Yamada, M., Jitkrittum, W., Sigal, L., Xing, E. P., & Sugiyama, M.
    High-dimensional feature selection by feature-wise kernelized lasso.
    Neural Computation, vol.26, no.1, pp.185-207, 2014.
    [ paper ]

  16. Niu, G., Dai, B., Shang, L., & Sugiyama, M.
    Maximum volume clustering: A new discriminative clustering approach.
    Journal of Machine Learning Research, vol.14 (Sep.), pp.2641-2687, 2013.
    [ paper ]

  17. Sainui, J. & Sugiyama, M.
    Direct approximation of quadratic mutual information and its application to dependence-maximization clustering.
    IEICE Transactions on Information and Systems, vol.E96-D, no.10, pp.2282-2285, 2013.
    [ paper ]

  18. Sugiyama, M., Yamada, M., & du Plessis, M. C.
    Learning under non-stationarity: Covariate shift and class-balance change.
    WIREs Computational Statistics, vol.5, no.6, pp.465-477, 2013.
    [ paper ]

  19. Sugiyama, M.
    Distance approximation between probability distributions: Recent advances in machine learning.
    Transactions of the Japan Society for Industrial and Applied Mathematics, vol.23, no.3, pp.439-452, 2013.
    [ paper in Japanese ]

  20. Nakajima, S. & Sugiyama, M.
    Recent advances in variational Bayesian learning theory.
    Transactions of the Japan Society for Industrial and Applied Mathematics, vol.23, no.3, pp.453-483, 2013.
    [ paper in Japanese ]

  21. Sugiyama, M., Suzuki, T., Kanamori, T., du Plessis, M. C., Liu, S., & Takeuchi, I.
    Density-difference estimation.
    Neural Computation, vol.25, no.10, pp.2734-2775, 2013.
    [ paper ]

  22. Khan, R. R. & Sugiyama, M.
    Semi-supervised least-squares conditional density estimation.
    International Journal of Scientific Engineering and Technology, vol.2, no.9, pp.900-904, 2013.

  23. Yamanaka, M., Matsugu, M., & Sugiyama, M.
    Salient object detection based on direct density-ratio estimation.
    IPSJ Transactions on Mathematical Modeling and Its Applications, vol.6, no.2, pp.78-85, 2013.

  24. Yamanaka, M., Matsugu, M., & Sugiyama, M.
    Detection of activities and events without explicit categorization.
    IPSJ Transactions on Mathematical Modeling and Its Applications, vol.6, no.2, pp.86-92, 2013.

  25. Suzuki, T. & Sugiyama, M.
    Fast learning rate of multiple kernel learning: Trade-off between sparsity and smoothness.
    The Annals of Statistics, vol.41, no.3, pp.1381-1405, 2013.
    [ paper ]

  26. Nam, H., Hachiya, H., & Sugiyama, M.
    Computationally efficient multi-label classification by least-squares probabilistic classifiers.
    IEICE Transactions on Information and Systems, vol.E96-D, no.8, pp.1871-1874, 2013.
    [ paper ]

  27. Nakajima, S., Sugiyama, M., & Babacan, D.
    Variational Bayesian sparse additive matrix factorization.
    Machine Learning, vol.92, no.2-3, pp.319-347, 2013.
    [ paper ]

  28. Jitkrittum, W., Hachiya, H., & Sugiyama, M.
    Feature selection via l1-penalized squared-loss mutual information.
    IEICE Transactions on Information and Systems, vol.E96-D, no.7, pp.1513-1524, 2013.
    [ paper ]

  29. Nakata, K., Sugiyama, M., Kitagawa, K., & Otsuki, M.
    Improved algorithm for multiwavelength single-shot interferometric surface profiling: Speeding up the multiwavelength-integrated local model fitting method by local information sharing.
    Applied Optics, vol.52, no.17, pp.4042-4053, 2013.
    [ paper ]

  30. Sugiyama, M., Liu, S., du Plessis, M. C., Yamanaka, M., Yamada, M., Suzuki, T., & Kanamori, T.
    Direct divergence approximation between probability distributions and its applications in machine learning.
    Journal of Computing Science and Engineering, vol.7, no.2, pp.99-111, 2013.
    [ paper ]

  31. Zhao, T., Hachiya, H., Tangkaratt, V., Morimoto, J., & Sugiyama, M.
    Efficient sample reuse in policy gradients with parameter-based exploration.
    Neural Computation, vol.25, no.6, pp.1512-1547, 2013.
    [ paper ]

  32. Xie, N., Hachiya, H., & Sugiyama, M.
    Artist agent: A reinforcement learning approach to automatic stroke generation in oriental ink painting.
    IEICE Transactions on Information and Systems, vol.E96-D, no.5, pp.1134-1144, 2013.
    [ paper, demo (mp4) ]

  33. Liu, S., Yamada, M., Collier, N., & Sugiyama, M.
    Change-point detection in time-series data by relative density-ratio estimation.
    Neural Networks, vol.43, pp.72-83, 2013.
    [ paper ]

  34. Yamada, M., Suzuki, T., Kanamori, T., Hachiya, H., & Sugiyama, M.
    Relative density-ratio estimation for robust distribution comparison.
    Neural Computation, vol.25, no.5, pp.1324-1370, 2013.
    [ paper ]

  35. Kimura, A., Sugiyama, M., Nakano, T., Kameoka, H., Sakano, H., Maeda, E., & Ishiguro, K.
    SemiCCA: Efficient semi-supervised learning of canonical correlations.
    IPSJ Transactions on Mathematical Modeling and Its Applications, vol.6, no.1, pp.136-145, 2013.
    [ paper ]

  36. Kimura, A., Sugiyama, M., Sakano, H., & Kameoka, H.
    Designing various multivariate analysis at will via generalized pairwise expression.
    IPSJ Transactions on Mathematical Modeling and Its Applications, vol.6, no.1, pp.128-135, 2013.
    [ paper ]

  37. Magrans de Abril, I. & Sugiyama, M.
    Winning the Kaggle Algorithmic Trading Challenge with the composition of many models and feature engineering.
    IEICE Transactions on Information and Systems, vol.E96-D, no.3, pp.742-745, 2013.
    [ paper ]

  38. Kanamori, T., Suzuki, T., & Sugiyama, M.
    Computational complexity of kernel-based density-ratio estimation: A condition number analysis.
    Machine Learning, vol.90, no.3, pp.431-460, 2013.
    [ paper ]

  39. Suzuki, T. & Sugiyama, M.
    Sufficient dimension reduction via squared-loss mutual information estimation.
    Neural Computation, vol.25, no.3, pp.725-758, 2013.
    [ paper ]

  40. Nakajima, S., Sugiyama, M., Babacan, D., & Tomioka, R.
    Global analytic solution of fully-observed variational Bayesian matrix factorization.
    Journal of Machine Learning Research, vol.14 (Jan.), pp.1-37, 2013.
    [ paper ]

  41. Sugiyama, M.
    Machine learning with squared-loss mutual information.
    Entropy, vol.15, no.1, pp.80-112, 2013.
    [ paper ]

  42. Yamada, M., Wichern, G., Kondo, K., Sugiyama, M., & Sawada, H.
    Noise adaptive optimization of matrix initialization for frequency-domain independent component analysis.
    Digital Signal Processing, vol.23, no.1, pp.1-8, 2013.
    [ paper ]

  43. Sugiyama, M. & Yamada, M.
    On kernel parameter selection in Hilbert-Schmidt independence criterion.
    IEICE Transactions on Information and Systems, vol.E95-D, no.10, pp.2564-2567, 2012.
    [ paper ]

  44. Simm, J., Sugiyama, M., & Hachiya, H.
    Multi-task approach to reinforcement learning for factored-state Markov decision problems.
    IEICE Transactions on Information and Systems, vol.E95-D, no.10, pp.2426-2437, 2012.
    [ paper ]

  45. Kurihara, N. & Sugiyama, M.
    Improving importance estimation in pool-based batch active learning for approximate linear regression.
    Neural Networks, vol.36, pp.73-82, 2012.
    [ paper ]

  46. Yamashita, A., Sugiyama, M., Kitagawa, K., & Kobayashi, H.
    Multiwavelength-integrated local model fitting method for interferometric surface profiling.
    Applied Optics, vol.51, no.28, pp.6700-6707, 2012.
    [ paper ]

  47. Kobayashi, T. & Sugiyama, M.
    Early stopping heuristics in pool-based incremental active learning for least-squares probabilistic classifier.
    IEICE Transactions on Information and Systems, vol.E95-D, no.8, pp.2065-2073, 2012.
    [ paper ]

  48. Sugiyama, M., Suzuki, T., & Kanamori, T.
    Density ratio matching under the Bregman divergence: A unified framework of density ratio estimation.
    Annals of the Institute of Statistical Mathematics, vol.64, no.5, pp.1009-1044, 2012.
    [ paper ]

  49. Karasuyama, M. & Sugiyama, M.
    Canonical dependency analysis based on squared-loss mutual information.
    Neural Networks, vol.34, pp.46-55, 2012.
    [ paper ]

  50. Feng, J., Wang, L., Sugiyama, M., Yang, C., Zhou, Z.-H., & Zhang, C.
    Boosting and margin theory.
    Frontiers of Electrical and Electronic Engineering, vol.7, no.1, pp.127-133, 2012.

  51. Karasuyama, M., Harada, N., Sugiyama, M., & Takeuchi, I.
    Multi-parametric solution-path algorithm for instance-weighted support vector machines.
    Machine Learning, vol.88, no.3, pp.297-330, 2012.
    [ paper ]

  52. Kawahara, Y. & Sugiyama, M.
    Sequential change-point detection based on direct density-ratio estimation.
    Statistical Analysis and Data Mining, vol.5, no.2, pp.114-127, 2012.
    [ paper ]

  53. Kitagawa, K., Tsuboi, T., Sugihara, H., Sugiyama, M., & Ogawa, H.
    Development of multi-wavelength single-shot interferometry and its practical application.
    Journal of the Japan Society for Precision Engineering, vol.78, no.2, pp.112-116, 2012.
    [ paper in Japanese ]

  54. Kanamori, T., Suzuki, T., & Sugiyama, M.
    Statistical analysis of kernel-based least-squares density-ratio estimation.
    Machine Learning, vol.86, no.3, pp.335-367, 2012.
    [ paper ]

  55. Kanamori, T., Suzuki, T., & Sugiyama, M.
    F-divergence estimation and two-sample homogeneity test under semiparametric density-ratio models.
    IEEE Transactions on Information Theory, vol.58, no.2, pp.708-720, 2012.
    [ paper ]

  56. Zhao, T., Hachiya, H., Niu, G., & Sugiyama, M.
    Analysis and improvement of policy gradient estimation.
    Neural Networks, vol.26, pp.118-129, 2012.
    [ paper ]

  57. Hachiya, H., Sugiyama, M., & Ueda, N.
    Importance-weighted least-squares probabilistic classifier for covariate shift adaptation with application to human activity recognition.
    Neurocomputing, vol.80, pp.93-101, 2012.
    [ paper ]

  58. Hachiya, H., Peters, J., & Sugiyama, M.
    Reward weighted regression with sample reuse for direct policy search in reinforcement learning.
    Neural Computation, vol.23, no.11, pp.2798-2832, 2011.
    [ paper ]

  59. Nakajima, S. & Sugiyama, M.
    Theoretical analysis of Bayesian matrix factorization.
    Journal of Machine Learning Research, vol.12 (Sep.), pp.2583-2648, 2011.
    [ paper ]

  60. Kimura, M. & Sugiyama, M.
    Dependence-maximization clustering with least-squares mutual information.
    Journal of Advanced Computational Intelligence and Intelligent Informatics, vol.15, no.7, pp.800-805, 2011.
    [ paper ]

  61. Mori, S., Sugiyama, M., Ogawa, H., Kitagawa, K., & Irie, K.
    Automatic parameter optimization of the local model fitting method for single-shot surface profiling.
    Applied Optics, vol.50, no.21, pp.3773-3780, 2011.
    [ paper ]

  62. Sugiyama, M., Suzuki, T., Itoh, Y., Kanamori, T., & Kimura, M.
    Least-squares two-sample test.
    Neural Networks, vol.24, no.7, pp.735-751, 2011.
    [ paper ]

  63. Krämer, N. & Sugiyama, M.
    The degrees of freedom of partial least squares regression.
    Journal of the American Statistical Association, vol.106, no.494, pp.697-705, 2011.
    (Based on this work, Nicole Krämer received the PLS Award from Addinsoft)
    [ paper ]

  64. Wang, L., Sugiyama, M., Jing, Z., Yang, C., Zhou, Z.-H., & Feng, J.
    A refined margin analysis for boosting algorithms via equilibrium margin.
    Journal of Machine Learning Research, vol.12 (Jun.), pp.1835-1863, 2011.
    [ paper ]

  65. Tomioka, R., Suzuki, T., & Sugiyama, M.
    Super-linear convergence of dual augmented Lagrangian algorithm for sparsity regularized estimation.
    Journal of Machine Learning Research, vol.12 (May), pp.1537-1586, 2011.
    [ paper ]

  66. Yamada, M., Sugiyama, M., Wichern, G., & Simm, J.
    Improving the accuracy of least-squares probabilistic classifiers.
    IEICE Transactions on Information and Systems, vol.E94-D, no.6, pp.1337-1340, 2011.
    [ paper ]

  67. Sugiyama, M. & Suzuki, T.
    Least-squares independence test.
    IEICE Transactions on Information and Systems, vol.E94-D, no.6, pp.1333-1336, 2011.
    [ paper ]

  68. Ueki, K., Sugiyama, M., & Ihara, Y.
    Lighting condition adaptation for perceived age estimation.
    IEICE Transactions on Information and Systems, vol.E94-D, no.2, pp.392-395, 2011.
    [ paper ]

  69. Simm, J., Sugiyama, M., & Kato, T.
    Computationally efficient multi-task learning with least-squares probabilistic classifiers.
    IPSJ Transactions on Computer Vision and Applications, vol.3, pp.1-8, 2011.
    [ paper ]

  70. Sugiyama, M., Yamada, M., von Bünau, P., Suzuki, T., Kanamori, T., & Kawanabe, M.
    Direct density-ratio estimation with dimensionality reduction via least-squares hetero-distributional subspace search.
    Neural Networks, vol.24, no.2, pp.183-198, 2011.
    [ paper ]

  71. Hido, S., Tsuboi, Y., Kashima, H., Sugiyama, M., & Kanamori, T.
    Statistical outlier detection using direct density ratio estimation.
    Knowledge and Information Systems, vol.26, no.2, pp.309-336, 2011.
    [ paper ]

  72. Rubens, N., Sheinman, V., Tomioka, R., & Sugiyama, M.
    Active learning in black-box settings.
    Austrian Journal of Statistics, vol.40, no.1-2, pp.125-135, 2011.

  73. Suzuki, T. & Sugiyama, M.
    Least-squares independent component analysis.
    Neural Computation, vol.23, no.1, pp.284-301, 2011.
    [ paper ]

  74. Sugiyama, M.
    A new approach to machine learning based on density ratios.
    Proceedings of the Institute of Statistical Mathematics, vol.58, no.2, pp.141-155, 2010.
    [ paper in Japanese ]

  75. Yamada, M., Sugiyama, M., Wichern, G., & Simm, J.
    Direct importance estimation with a mixture of probabilistic principal component analyzers.
    IEICE Transactions on Information and Systems, vol.E93-D, no.10, pp.2846-2849, 2010.

  76. Ueki, K., Sugiyama, M., & Ihara, Y.
    A semi-supervised approach to perceived age prediction from face images.
    IEICE Transactions on Information and Systems, vol.E93-D, no.10, pp.2875-2878, 2010.
    [ paper ]

  77. Sugiyama, M.
    Superfast-trainable multi-class probabilistic classifier by least-squares posterior fitting.
    IEICE Transactions on Information and Systems, vol.E93-D, no.10, pp.2690-2701, 2010.
    [ paper (revised version) ]

  78. Sugiyama, M., Hachiya, H., Kashima, H., & Morimura, T.
    Least absolute policy iteration---A robust approach to value function approximation.
    IEICE Transactions on Information and Systems, vol.E93-D, no.9, pp.2555-2565, 2010.
    [ paper ]

  79. Kurihara, N., Sugiyama, M., Ogawa, H., Kitagawa, K., & Suzuki, K.
    Iteratively-reweighted local model fitting method for adaptive and accurate single-shot surface profiling.
    Applied Optics, vol.49, no.22, pp.4270-4277, 2010.
    [ paper ]

  80. Kato, T., Kashima, H., Sugiyama, M., & Asai, K.
    Conic programming for multi-task learning.
    IEEE Transactions on Knowledge and Data Engineering, vol.22, no.7, pp.957-968, 2010.
    [ paper ]

  81. Li, Y., Kambara, H., Koike, Y., & Sugiyama, M.
    Application of covariate shift adaptation techniques in brain computer interfaces.
    IEEE Transactions on Biomedical Engineering, vol.57, no.6, pp.1318-1324, 2010.
    [ paper ]

  82. Shimizu, N., Sugiyama, M., & Nakagawa, H.
    Spectral methods for thesaurus construction.
    IEICE Transactions on Information and Systems, vol.E93-D, no.6, pp.1378-1385, 2010.
    [ paper ]

  83. Akiyama, T., Hachiya, H., & Sugiyama, M.
    Efficient exploration through active learning for value function approximation in reinforcement learning.
    Neural Networks, vol.23, no.5, pp.639-648, 2010.
    [ paper, demo (wmv) ]

  84. Yamada, M., Sugiyama, M., & Matsui, T.
    Semi-supervised speaker identification under covariate shift.
    Signal Processing, vol.90, no.8, pp.2353-2361, 2010.
    [ paper ]

  85. Kanamori, T., Suzuki, T., & Sugiyama, M.
    Theoretical analysis of density ratio estimation.
    IEICE Transactions on Fundamentals of Electronics, Communications and Computer Sciences, vol.E93-A, no.4, pp.787-798, 2010.
    [ paper ]

  86. Kato, T., Okada, K., Kashima, H., & Sugiyama, M.
    A transfer learning approach and selective integration of multiple types of assays for biological network inference.
    International Journal of Knowledge Discovery in Bioinformatics, vol.1, no.1, pp.66-80, 2010.
    [ paper ]

  87. Sugiyama, M., Takeuchi, I., Kanamori, T., Suzuki, T., Hachiya, H., & Okanohara, D.
    Least-squares conditional density estimation.
    IEICE Transactions on Information and Systems, vol.E93-D, no.3, pp.583-594, 2010.
    [ paper ]

  88. Sugiyama, M., Idé, T., Nakajima, S., & Sese, J.
    Semi-supervised local Fisher discriminant analysis for dimensionality reduction.
    Machine Learning, vol.78, no.1-2, pp.35-61, 2010.
    [ paper ]

  89. Sugiyama, M., Kawanabe, M., & Chui, P. L.
    Dimensionality reduction for density ratio estimation in high-dimensional spaces.
    Neural Networks, vol.23, no.1, pp.44-59, 2010.
    (This paper was selected for 2011 JNNS Best Paper Award)
    [ paper ]

  90. Rubens, N., Tomioka, R., & Sugiyama, M.
    Output divergence criterion for active learning in collaborative settings.
    IPSJ Transactions on Mathematical Modeling and Its Applications, vol.2, no.3, pp.87-96, 2009.
    [ paper ]

  91. Hachiya, H., Akiyama, T., Sugiyama, M., & Peters, J.
    Adaptive importance sampling for value function approximation in off-policy reinforcement learning.
    Neural Networks, vol.22, no.10, pp.1399-1410, 2009.
    [ paper ]

  92. Naito, T., Sugiyama, M., Ogawa, H., Kitagawa, K., & Suzuki, K.
    Single-shot interferometry of film-covered objects.
    Journal of the Japan Society for Precision Engineering, vol.75, no.11, pp.1315-1322, 2009.
    [ paper in Japanese ]

  93. Kashima, H., Kato, T., Yamanishi, Y., Sugiyama, M., & Tsuda, K.
    Simultaneous inference of biological networks of multiple species from genome-wide data and evolutionary information: A semi-supervised approach.
    Bioinformatics, vol.25, no.22, pp.2962-2968, 2009.
    [ paper ]

  94. Tomioka, R. & Sugiyama, M.
    Dual augmented Lagrangian method for efficient sparse reconstruction.
    IEEE Signal Processing Letters, vol.16, no.12, pp.1067-1070, 2009.
    [ paper ]

  95. Yamada, M. & Sugiyama, M.
    Direct importance estimation with Gaussian mixture models.
    IEICE Transactions on Information and Systems, vol.E92-D, no.10, pp.2159-2162, 2009.
    [ paper ]

  96. Sugiyama, M., Kanamori, T., Suzuki, T., Hido, S., Sese, J., Takeuchi, I., & Wang, L.
    A density-ratio framework for statistical data processing.
    IPSJ Transactions on Computer Vision and Applications, vol.1, pp.183-208, 2009.
    [ paper ]

  97. Kanamori, T., Hido, S., & Sugiyama, M.
    A least-squares approach to direct importance estimation.
    Journal of Machine Learning Research, vol.10 (Jul.), pp.1391-1445, 2009.
    [ paper, software (html) ]

  98. Takeda, A. & Sugiyama, M.
    On generalization performance and non-convex optimization of extended nu-support vector machine.
    New Generation Computing, vol.27, no.3, pp.259-279, 2009.
    [ paper ]

  99. Kashima, H., Idé, T., Kato, T., & Sugiyama, M.
    Recent advances and trends in large-scale kernel methods.
    IEICE Transactions on Information and Systems, vol.E92-D, no.7, pp.1338-1353, 2009.
    [ paper ]

  100. Yokota, T., Sugiyama, M., Ogawa, H., Kitagawa, K., & Suzuki, K.
    The interpolated local model fitting method for accurate and fast single-shot surface profiling.
    Applied Optics, vol.48, no.18, pp.3497-3508, 2009.
    [ paper (revised version) ]

  101. Sugiyama, M. & Nakajima, S.
    Pool-based active learning in approximate linear regression.
    Machine Learning, vol.75, no.3, pp.249-274, 2009.
    [ paper ]

  102. Sugiyama, M.
    On computational issues of semi-supervised local Fisher discriminant analysis.
    IEICE Transactions on Information and Systems, vol.E92-D, no.5, pp.1204-1208, 2009.
    [ paper ]

  103. Wang, L., Sugiyama, M., Yang, C., Hatano, K., & Feng, J.
    Theory and algorithm for learning with dissimilarity functions.
    Neural Computation, vol.21, no.5, pp.1459-1484, 2009.
    [ paper ]

  104. Tsuboi, Y., Kashima, H., Hido, S., Bickel, S., & Sugiyama, M.
    Direct density ratio estimation for large-scale covariate shift adaptation.
    Journal of Information Processing, vol.17, pp.138-155, 2009.
    [ paper ]

  105. Ogawa, H., Nakanowatari, A., Kitagawa, K., Sugiyama, M., & Naito, T.
    Simultaneous measurement of film thickness and surface profile of film-covered objects by monochromatic light interferometry.
    Transactions of the Society of Instrument and Control Engineers, vol.45, no.2, pp.73-82, 2009.
    [ paper in Japanese ]

  106. Kitagawa, K., Sugiyama, M., Matsuzaka, T., Ogawa, H., & Suzuki, K.
    Two-wavelength single-shot interferometry.
    Journal of the Japan Society for Precision Engineering, vol.75, no.2, pp.273-277, 2009.
    [ paper in Japanese ]

  107. Suzuki, T., Sugiyama, M., Kanamori, T., & Sese, J.
    Mutual information estimation reveals global associations between stimuli and biological processes.
    BMC Bioinformatics, vol.10, no.1, pp.S52 (12 pages), 2009.
    [ paper ]

  108. Kato, T., Kashima, H., & Sugiyama, M.
    Robust label propagation on multiple networks.
    IEEE Transactions on Neural Networks, vol.20, no.1, pp.35-44, 2009.
    [ paper ]

  109. Sugiyama, M. & Rubens, N.
    A batch ensemble approach to active learning with model selection.
    Neural Networks, vol.21, no.9, pp.1278-1286, 2008.
    [ paper ]

  110. Sugiyama, M., Suzuki, T., Nakajima, S., Kashima, H., von Bünau, P., & Kawanabe, M.
    Direct importance estimation for covariate shift adaptation.
    Annals of the Institute of Statistical Mathematics, vol.60, no.4, pp.699-746, 2008.
    [ paper ]

  111. Jankovic, M. V. & Sugiyama, M.
    A multipurpose linear component analysis method based on modulated Hebb Oja learning rule.
    IEEE Signal Processing Letters, vol.15, pp.677-680, 2008.
    [ paper ]

  112. Sugiyama, M., Hachiya, H., Towell, C., & Vijayakumar, S.
    Geodesic Gaussian kernels for value function approximation.
    Autonomous Robots, vol.25, no.3, pp.287-304, 2008.
    [ paper, demo (wmv) ]

  113. Sugiyama, M., Kawanabe, M., Blanchard, G., & Müller, K.-R.
    Approximating the best linear unbiased estimator of non-Gaussian signals with Gaussian noise.
    IEICE Transactions on Information and Systems, vol.E91-D, no.5, pp.1577-1580, 2008.
    [ paper ]

  114. Gokita, S., Sugiyama, M., & Sakurai, K.
    Analytic optimization of adaptive ridge parameters based on regularized subspace information criterion.
    IEICE Transactions on Fundamentals of Electronics, Communications and Computer Sciences, vol.E90-A, no.11, pp.2584-2592, 2007.
    [ paper ]

  115. Hidaka, Y. & Sugiyama, M.
    A new meta-criterion for regularized subspace information criterion.
    IEICE Transactions on Information and Systems, vol.E90-D, no.11, pp.1779-1786, 2007.
    [ paper ]

  116. Sugiyama, M.
    Generalization error estimation for non-linear learning methods.
    IEICE Transactions on Fundamentals of Electronics, Communications and Computer Sciences, vol.E90-A, no.7, pp.1496-1499, 2007.
    [ paper (revised version) ]

  117. Sugiyama, M.
    Dimensionality reduction of multimodal labeled data by local Fisher discriminant analysis.
    Journal of Machine Learning Research, vol.8 (May), pp.1027-1061, 2007.
    [ paper, software (html) ]

  118. Sugiyama, M., Krauledat, M., & Müller, K.-R.
    Covariate shift adaptation by importance weighted cross validation.
    Journal of Machine Learning Research, vol.8 (May), pp.985-1005, 2007.
    [ paper ]

  119. Kawanabe, M., Sugiyama, M., Blanchard, G., & Müller, K.-R.
    A new algorithm of non-Gaussian component analysis with radial kernel functions.
    Annals of the Institute of Statistical Mathematics, vol.59, no.1, pp.57-75, 2007.
    [ paper ]

  120. Ogawa, H., Shimoyama, K., Fukunaga, M., Kitagawa, K., & Sugiyama, M.
    Simultaneous measurement of film thickness and surface profile of film-covered objects by using white-light interferometry.
    Transactions of the Society of Instrument and Control Engineers, vol.43, no.2, pp.71-77, 2007.
    [ paper in Japanese ]

  121. Sugiyama, M., Ogawa, H., Kitagawa, K., & Suzuki, K.
    Single-shot surface profiling by local model fitting.
    Applied Optics, vol.45, no.31, pp.7999-8005, 2006.
    [ paper ]

  122. Sugiyama, M. & Sakurai, K.
    Analytic optimization of shrinkage parameters based on regularized subspace information criterion.
    IEICE Transactions on Fundamentals of Electronics, Communications and Computer Sciences, vol.E89-A, no.8, pp.2216-2225, 2006.
    [ paper ]

  123. Sugiyama, M. & Ogawa, H.
    Constructing kernel functions for binary regression.
    IEICE Transactions on Information and Systems, vol.E89-D, no.7, pp.2243-2249, 2006.
    [ paper ]

  124. Blanchard, G., Kawanabe, M., Sugiyama, M., Spokoiny, V., & Müller, K.-R.
    In search of non-Gaussian components of a high-dimensional distribution.
    Journal of Machine Learning Research, vol.7 (Feb), pp.277-282, 2006.
    [ paper ]

  125. Sugiyama, M.
    Active learning in approximately linear regression based on conditional expectation of generalization error.
    Journal of Machine Learning Research, vol.7 (Jan), pp.141-166, 2006.
    [ paper ]

  126. Sugiyama, M. & Müller, K.-R.
    Input-dependent estimation of generalization error under covariate shift.
    Statistics & Decisions, vol.23, no.4, pp.249-279, 2005.
    [ paper ]

  127. Sugiyama, M., Kawanabe, M., & Müller, K.-R.
    Trading variance reduction with unbiasedness: The regularized subspace information criterion for robust model selection in kernel regression.
    Neural Computation, vol.16, no.5, pp.1077-1104, 2004.
    [ paper ]

  128. Sugiyama, M., Okabe, Y., & Ogawa, H.
    Perturbation analysis of a generalization error estimator.
    Neural Information Processing - Letters and Reviews, vol.2, no.2, pp.33-38, Feb. 2004.
    [ paper ]

  129. Sugiyama, M. & Ogawa, H.
    Active learning with model selection---Simultaneous optimization of sample points and models for trigonometric polynomial models.
    IEICE Transactions on Information and Systems, vol.E86-D, no.12, pp.2753-2763, Dec. 2003.
    [ paper ]

  130. Sugiyama, M.
    Improving precision of the subspace information criterion.
    IEICE Transactions on Fundamentals of Electronics, Communications and Computer Sciences, vol.E86-A, no.7, pp.1885-1895, Jul. 2003.
    [ paper ]

  131. Sugiyama, M. & Müller, K.-R.
    The subspace information criterion for infinite dimensional hypothesis spaces.
    Journal of Machine Learning Research, vol.3 (Nov), pp.323-359, 2002.
    [ paper ]

  132. Sugiyama, M. & Ogawa, H.
    A unified method for optimizing linear image restoration filters.
    Signal Processing, vol.82, no.11, pp.1773-1787, 2002.
    [ paper ]

  133. Sugiyama, M. & Ogawa, H.
    Incremental construction of projection generalizing neural networks.
    IEICE Transactions on Information and Systems, vol.E85-D, no.9, pp.1433-1442, Sep. 2002.
    [ paper ]

  134. Sugiyama, M. & Ogawa, H.
    Optimal design of regularization term and regularization parameter by subspace information criterion.
    Neural Networks, vol.15, no.3, pp.349-361, 2002.
    [ paper ]

  135. Tsuda, K., Sugiyama, M., & Müller, K.-R.
    Subspace information criterion for non-quadratic regularizers---Model selection for sparse regressors.
    IEEE Transactions on Neural Networks, vol.13, no.1, pp.70-80, 2002.
    [ paper ]

    [Japanese Version]
    Tsuda, K., Sugiyama, M., & Müller, K.-R.
    Subspace information criterion for sparse regressors.
    IEICE Transactions on Information and Systems, vol.J85-D-II, no.5, pp.766-775, May 2002.
    [ paper in Japanese ]

  136. Sugiyama, M. & Ogawa, H.
    Theoretical and experimental evaluation of the subspace information criterion.
    Machine Learning, vol.48, no.1/2/3, pp.25-50, 2002.
    [ paper ]

  137. Sugiyama, M., Imaizumi, D., & Ogawa, H.
    Subspace information criterion for image restoration---Optimizing parameters in linear filters.
    IEICE Transactions on Information and Systems, vol.E84-D, no.9, pp.1249-1256, Sep. 2001.
    (This paper was selected for 2002 Niwa Memorial Award)
    [ paper ]

  138. Sugiyama, M. & Ogawa, H.
    Active learning for optimal generalization in trigonometric polynomial models.
    IEICE Transactions on Fundamentals of Electronics, Communications and Computer Sciences, vol.E84-A, no.9, pp.2319-2329, Sep. 2001.
    [ paper ]

  139. Sugiyama, M. & Ogawa, H.
    Subspace information criterion for model selection.
    Neural Computation, vol.13, no.8, pp.1863-1889, 2001.
    [ paper ]

  140. Sugiyama, M. & Ogawa, H.
    Incremental projection learning for optimal generalization.
    Neural Networks, vol.14, no.1, pp.53-66, 2001.
    [ paper (revised version) ]

  141. Sugiyama, M. & Ogawa, H.
    Properties of incremental projection learning.
    Neural Networks, vol.14, no.1, pp.67-78, 2001.
    [ paper (revised version) ]

  142. Sugiyama, M. & Ogawa, H.
    Incremental active learning for optimal generalization.
    Neural Computation, vol.12, no.12, pp.2909-2940, 2000.
    [ paper ]


Major Conferences

  1. Sasaki, H., Hyvärinen, A., & Sugiyama, M.
    Clustering via mode seeking by direct estimation of the gradient of a log-density.
    In X. XXX, Y. YYY, and Z. ZZZ (Eds.), Machine Learning and Knowledge Discovery in Databases, Part xxx, Lecture Notes in Computer Science, vol.xxxx, pp.xxxx-xxxx, Berlin, Springer, 2014.
    (Presented at the European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases (ECML-PKDD2014), Nancy, France, Sep. 15-19, 2014)
    [ paper ]

  2. Ma, Y., Zhao, T., Hatano, K., & Sugiyama, M.
    An online policy gradient algorithm for continuous state and action Markov decision processes.
    In X. XXX, Y. YYY, and Z. ZZZ (Eds.), Machine Learning and Knowledge Discovery in Databases, Part xxx, Lecture Notes in Computer Science, vol.xxxx, pp.xxxx-xxxx, Berlin, Springer, 2014.
    (Presented at the European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases (ECML-PKDD2014), Nancy, France, Sep. 15-19, 2014)
    [ paper ]

  3. Niu, G., Dai, B., du Plessis, M. C., & Sugiyama, M.
    Transductive learning with multi-class volume approximation.
    In E. Xing and T. Jebara (Eds.), Proceedings of 31st International Conference on Machine Learning (ICML2014), JMLR Workshop and Conference Proceedings, vol.32, pp.xxx-xxx, Beijing, China, Jun. 21-26, 2014.
    [ paper ]

  4. Suzumura, S., Ogawa, K., Sugiyama, M., & Takeuchi, I.
    Outlier path: A homotopy algorithm for robust SVM.
    In E. Xing and T. Jebara (Eds.), Proceedings of 31st International Conference on Machine Learning (ICML2014), JMLR Workshop and Conference Proceedings, vol.32, pp.xxx-xxx, Beijing, China, Jun. 21-26, 2014.
    [ paper ]

  5. Nakajima, S. & Sugiyama, M.
    Analysis of empirical MAP and empirical partially Bayes: Can they be alternatives to variational Bayes?
    In S. Kaski and J. Corander (Eds.), Proceedings of Seventeenth International Conference on Artificial Intelligence and Statistics (AISTATS2014), JMLR Workshop and Conference Proceedings, vol.33, pp.20-28, Reykjavik, Iceland, Apr. 22-24, 2014.
    (This paper was selected for Notable Paper Award)
    [ paper ]

  6. Noh, Y.-K., Sugiyama, M., Liu, S., du Plessis, M. C., Park, F. C., & Lee, D. D.
    Bias reduction and metric learning for nearest-neighbor estimation of Kullback-Leibler divergence.
    In S. Kaski and J. Corander (Eds.), Proceedings of Seventeenth International Conference on Artificial Intelligence and Statistics (AISTATS2014), JMLR Workshop and Conference Proceedings, vol.33, pp.669-677, Reykjavik, Iceland, Apr. 22-24, 2014.
    [ paper ]

  7. du Plessis, M. C., Niu, G., & Sugiyama, M.
    Clustering unclustered data: Unsupervised binary labeling of two datasets having different class balances.
    In Proceedings of Conference on Technologies and Applications of Artificial Intelligence (TAAI2013), pp.1-6, Taipei, Taiwan, Dec. 6-8, 2013.
    (This paper was selected for Best Paper Award)
    [ paper ]

  8. Nakajima, S., Takeda, A., Babacan, D., Sugiyama, M., & Takeuchi, I.
    Global solver and its efficient approximation for variational Bayesian low-rank subspace clustering.
    In C. J. C. Burges, L. Bottou, M. Welling, Z. Ghahramani, and K. Q. Weinberger (Eds.), Advances in Neural Information Processing Systems 26, pp.1439-1447, 2013.
    (Presented at Neural Information Processing Systems (NIPS2013), Lake Tahoe, Nevada, USA, Dec. 5-8, 2013)
    [ paper ]

  9. Takeuchi, I., Hongo, T., Sugiyama, M., & Nakajima, S.
    Parametric task learning.
    In C. J. C. Burges, L. Bottou, M. Welling, Z. Ghahramani, and K. Q. Weinberger (Eds.), Advances in Neural Information Processing Systems 26, pp.1358-1366, 2013.
    (Presented at Neural Information Processing Systems (NIPS2013), Lake Tahoe, Nevada, USA, Dec. 5-8, 2013)
    [ paper ]

  10. Liu, S., Quinn, J., Gutmann, M. U., & Sugiyama, M.
    Direct learning of sparse changes in Markov networks by density ratio estimation.
    In H. Blockeel, K. Kersting, S. Nijssen, and F. Železný (Eds.), Machine Learning and Knowledge Discovery in Databases, Part II, Lecture Notes in Computer Science, vol.8189, pp.596-611, Berlin, Springer, 2013.
    (Presented at the European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases (ECML-PKDD2013), Prague, Czech Republic, Sep. 23-27, 2013)
    [ paper ]

  11. Niu, G., Jitkrittum, W., Dai, B., Hachiya, H., & Sugiyama, M.
    Squared-loss mutual information regularization.
    In S. Dasgupta and D. McAllester (Eds.), Proceedings of 30th International Conference on Machine Learning (ICML2013), JMLR Workshop and Conference Proceedings, vol.28, pp.10-18, Atlanta, Georgia, USA, Jun. 16-21, 2013.
    [ paper ]

  12. Ogawa, K., Imamura, M., Takeuchi, I., & Sugiyama, M.
    Infinitesimal annealing for training semi-supervised support vector machines.
    In S. Dasgupta and D. McAllester (Eds.), Proceedings of 30th International Conference on Machine Learning (ICML2013), JMLR Workshop and Conference Proceedings, vol.28, pp.897-905, Atlanta, Georgia, USA, Jun. 16-21, 2013.
    [ paper ]

  13. Sugiyama, M., Suzuki, T., Kanamori, T., du Plessis, M. C., Liu, S., & Takeuchi, I.
    Density-difference estimation.
    In P. Bartlett, F. C. N. Pereira, C. J. C. Burges, L. Bottou, and K. Q. Weinberger (Eds.), Advances in Neural Information Processing Systems 25, pp.692-700, 2012.
    (Presented at Neural Information Processing Systems (NIPS2012), Lake Tahoe, Nevada, USA, Dec. 3-6, 2012)
    [ paper, poster ]

  14. Nakajima, S., Tomioka, R., Sugiyama, M., & Babacan, D.
    Perfect dimensionality recovery by variational Bayesian PCA.
    In P. Bartlett, F. C. N. Pereira, C. J. C. Burges, L. Bottou, and K. Q. Weinberger (Eds.), Advances in Neural Information Processing Systems 25, pp.980-988, 2012.
    (Presented at Neural Information Processing Systems (NIPS2012), Lake Tahoe, Nevada, USA, Dec. 3-6, 2012)
    [ paper ]

  15. Kimura, A., Sugiyama, M., Kameoka, H., & Sakano, H.
    Designing various component analysis at will.
    In Proceedings of 21st International Conference on Pattern Recognition (ICPR2012), pp.2959-2962, Tsukuba, Japan, Nov. 11-15, 2012.
    [ paper ]

  16. Liu, S., Yamada, M., Collier, N., & Sugiyama, M.
    Change-point detection in time-series data by relative density-ratio estimation.
    In G. Gimel'farb, E. Hancock, A. Imiya, A. Kuijper, M. Kudo, S. Omachi, T. Windeatt, and K. Yamada (Eds.), Structural, Syntactic, and Statistical Pattern Recognition, Lecture Notes in Computer Science, vol.7626, pp.363-372, Berlin, Springer, 2012.
    (Presented at 9th International Workshop on Statistical Techniques in Pattern Recognition (SPR2012), Hiroshima, Japan, Nov. 7-9, 2012)
    [ paper ]

  17. Nakajima, S., Sugiyama, M., & Babacan, D.
    Sparse additive matrix factorization for robust PCA and its generalization.
    In S. C. H. Hoi and W. Buntine (Eds.), Proceedings of the Fourth Asian Conference on Machine Learning (ACML2012), JMLR Workshop and Conference Proceedings, vol.25, pp.301-316, Singapore, Nov. 4-6, 2012.
    [ paper ]

  18. Niu, G., Dai, B., Yamada, M., & Sugiyama, M.
    Information-theoretic semi-supervised metric learning via entropy regularization.
    In J. Langford and J. Pineau (Eds.), Proceedings of 29th International Conference on Machine Learning (ICML2012), pp.89-96, Edinburgh, Scotland, Jun. 26-Jul. 1, 2012.
    [ paper ]

  19. Xie, N., Hachiya, H., & Sugiyama, M.
    Artist agent: A reinforcement learning approach to automatic stroke generation in oriental ink painting.
    In J. Langford and J. Pineau (Eds.), Proceedings of 29th International Conference on Machine Learning (ICML2012), pp.153-160, Edinburgh, Scotland, Jun. 26-Jul. 1, 2012.
    [ paper, demo (mp4), external review 1, external review 2 ]

  20. du Plessis, M. C. & Sugiyama, M.
    Semi-supervised learning of class balance under class-prior change by distribution matching.
    In J. Langford and J. Pineau (Eds.), Proceedings of 29th International Conference on Machine Learning (ICML2012), pp.823-830, Edinburgh, Scotland, Jun. 26-Jul. 1, 2012.
    [ paper ]

  21. Suzuki, T. & Sugiyama, M.
    Fast learning rate of multiple kernel learning: Trade-off between sparsity and smoothness.
    In N. Lawrence and M. Girolami (Eds.), Proceedings of Fifteenth International Conference on Artificial Intelligence and Statistics (AISTATS2012), JMLR Workshop and Conference Proceedings, vol.22, pp.1152-1183, La Palma, Canary Islands, Apr. 21-23, 2012.
    [ paper ]

  22. Sugiyama, M., Hachiya, H., Yamada, M., Simm, J., & Nam, H.
    Least-squares probabilistic classifier: A computationally efficient alternative to kernel logistic regression.
    In Proceedings of International Workshop on Statistical Machine Learning for Speech Processing (IWSML2012), pp.1-10, Kyoto, Japan, Mar. 31, 2012.
    [ paper ]

  23. Nam, H., Hachiya, H., & Sugiyama, M.
    Computationally efficient multi-label classification by least-squares probabilistic classifier.
    In Proceedings of IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP2012), pp.2077-2080, Kyoto, Japan, Mar. 25-30, 2012.
    [ paper ]

  24. Yamada, M., Suzuki, T., Kanamori, T., Hachiya, H., & Sugiyama, M.
    Relative density-ratio estimation for robust distribution comparison.
    In J. Shawe-Taylor, R. S. Zemel, P. Bartlett, F. C. N. Pereira, and K. Q. Weinberger (Eds.), Advances in Neural Information Processing Systems 24, pp.594-602, 2011.
    (Presented at Neural Information Processing Systems (NIPS2011), Granada, Spain, Dec. 13-15, 2011)
    [ paper ]

  25. Zhao, T., Hachiya, H., Niu, G., & Sugiyama, M.
    Analysis and improvement of policy gradient estimation.
    In J. Shawe-Taylor, R. S. Zemel, P. Bartlett, F. C. N. Pereira, and K. Q. Weinberger (Eds.), Advances in Neural Information Processing Systems 24, pp.262-270, 2011.
    (Presented at Neural Information Processing Systems (NIPS2011), Granada, Spain, Dec. 13-15, 2011)
    [ paper ]

  26. Nakajima, S., Sugiyama, M., & Babacan, D.
    Global solution of fully-observed variational Bayesian matrix factorization is column-wise independent.
    In J. Shawe-Taylor, R. S. Zemel, P. Bartlett, F. C. N. Pereira, and K. Q. Weinberger (Eds.), Advances in Neural Information Processing Systems 24, pp.208-216, 2011.
    (Presented at Neural Information Processing Systems (NIPS2011), Granada, Spain, Dec. 13-15, 2011)
    [ paper ]

  27. Takeuchi, I. & Sugiyama, M.
    Target neighbor consistent feature weighting for nearest neighbor classification.
    In J. Shawe-Taylor, R. S. Zemel, P. Bartlett, F. C. N. Pereira, and K. Q. Weinberger (Eds.), Advances in Neural Information Processing Systems 24, pp.576-584, 2011.
    (Presented at Neural Information Processing Systems (NIPS2011), Granada, Spain, Dec. 13-15, 2011)
    [ paper ]

  28. Ueki, K., Sugiyama, M., Ihara, Y., & Fujita, M.
    Multi-race age estimation based on the combination of multiple classifiers.
    In Proceedings of the First Asian Conference on Pattern Recognition (ACPR2011), pp.633-637, Beijing, China, Nov. 28-30, 2011.
    [ paper ]

  29. Yamada, M., Niu, G., Takagi, J., & Sugiyama, M.
    Computationally efficient sufficient dimension reduction via squared-loss mutual information.
    In C.-N. Hsu and W. S. Lee (Eds.), Proceedings of the Third Asian Conference on Machine Learning (ACML2011), JMLR Workshop and Conference Proceedings, vol.20, pp.247-262, Taoyuan, Taiwan, Nov. 13-15, 2011.
    [ paper ]

  30. Matsugu, M., Yamanaka, M., & Sugiyama, M.
    Detection of activities and events without explicit categorization.
    In Proceedings of the 3rd International Workshop on Video Event Categorization, Tagging and Retrieval for Real-World Applications (VECTaR2011), pp.1532-1539, Barcelona, Spain, Nov. 13, 2011.
    [ paper ]

  31. Karasuyama, M., Harada, N., Sugiyama, M., & Takeuchi, I.
    Multi-parametric solution-path algorithm for instance-weighted support vector machines.
    In Proceedings of IEEE International Workshop on Machine Learning for Signal Processing (MLSP2011), pp.1-6, Beijing, China, Sep. 18-21, 2011.
    [ paper ]

  32. Yamada, M. & Sugiyama, M.
    Direct density-ratio estimation with dimensionality reduction via hetero-distributional subspace analysis.
    In Proceedings of the Twenty-Fifth AAAI Conference on Artificial Intelligence (AAAI2011), pp.549-554, San Francisco, California, USA, Aug. 7-11, 2011.
    [ paper ]

  33. Idé, T. & Sugiyama, M.
    Trajectory regression on road networks.
    In Proceedings of the Twenty-Fifth AAAI Conference on Artificial Intelligence (AAAI2011), pp.203-208, San Francisco, California, USA, Aug. 7-11, 2011.
    [ paper ]

  34. Sugiyama, M., Yamada, M., Kimura, M., & Hachiya, H.
    On information-maximization clustering: Tuning parameter selection and analytic solution.
    In L. Getoor and T. Scheffer (Eds.), Proceedings of 28th International Conference on Machine Learning (ICML2011), pp.65-72, Bellevue, Washington, USA, Jun. 28-Jul. 2, 2011.
    [ paper, slides ]

  35. Nakajima, S., Sugiyama, M., & Babacan, D.
    On Bayesian PCA: Automatic dimensionality selection and analytic solution.
    In L. Getoor and T. Scheffer (Eds.), Proceedings of 28th International Conference on Machine Learning (ICML2011), pp.497-504, Bellevue, Washington, USA, Jun. 28-Jul. 2, 2011.
    [ paper ]

  36. Takagi, J., Ohishi, Y., Kimura, A., Sugiyama, M., Yamada, M., & Kameoka, H.
    Automatic audio tag classification via semi-supervised canonical density estimation.
    In Proceedings of IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP2011), pp.2232-2235, Prague, Czech Republic, May 22-27, 2011.
    [ paper ]

  37. Yamada, M. & Sugiyama, M.
    Cross-domain object matching with model selection.
    In G. Gordon, D. Dunson, and M. Dudík (Eds.), Proceedings of Fourteenth International Conference on Artificial Intelligence and Statistics (AISTATS2011), JMLR Workshop and Conference Proceedings, vol.15, pp.807-815, Fort Lauderdale, Florida, USA, Apr. 11-13, 2011.
    [ paper ]

  38. Niu, G., Dai, B., Shang, L., & Sugiyama, M.
    Maximum volume clustering.
    In G. Gordon, D. Dunson, and M. Dudík (Eds.), Proceedings of Fourteenth International Conference on Artificial Intelligence and Statistics (AISTATS2011), JMLR Workshop and Conference Proceedings, vol.15, pp.561-569, Fort Lauderdale, Florida, USA, Apr. 11-13, 2011.
    [ paper ]

  39. Nakajima, S., Sugiyama, M., & Tomioka, R.
    Global analytic solution for variational Bayesian matrix factorization.
    In J. Lafferty, C. K. I. Williams, R. Zemel, J. Shawe-Taylor, and A. Culotta (Eds.), Advances in Neural Information Processing Systems 23, pp.1759-1767, 2010.
    (Presented at Neural Information Processing Systems (NIPS2010), Vancouver, British Columbia, Canada, Dec. 6-11, 2010)
    [ paper ]

  40. Tabei, Y., Uno, T., Sugiyama, M., & Tsuda, K.
    Single versus multiple sorting in all pairs similarity search.
    In M. Sugiyama and Q. Yang (Eds.), Proceedings of the Second Asian Conference on Machine Learning (ACML2010), JMLR Workshop and Conference Proceedings, vol.13, pp.145-160, Tokyo, Japan, Nov. 8-10, 2010.
    [ paper ]

  41. Hachiya, H. & Sugiyama, M.
    Feature selection for reinforcement learning: Evaluating implicit state-reward dependency via conditional mutual information.
    In J. L. Balcázar, F. Bonchi, A. Gionis, and M. Sebag (Eds.), Machine Learning and Knowledge Discovery in Databases, Part I, Lecture Notes in Computer Science, vol.6321, pp.474-489, Berlin, Springer, 2010.
    (Presented at the European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases (ECML-PKDD2010), Barcelona, Spain, Sep. 20-24, 2010)
    [ paper ]

  42. Sugiyama, M. & Simm, J.
    A computationally-efficient alternative to kernel logistic regression.
    In S. Kaski, D. J. Miller, E. Oja, and A. Honkela (Eds.), Proceedings of IEEE International Workshop on Machine Learning for Signal Processing (MLSP2010), pp.124-129, Kittilä, Finland, Aug. 29-Sep. 1, 2010.

  43. Takeda, A., Gotoh, J., & Sugiyama, M.
    Support vector regression as conditional value-at-risk minimization with application to financial time-series analysis.
    In S. Kaski, D. J. Miller, E. Oja, and A. Honkela (Eds.), Proceedings of IEEE International Workshop on Machine Learning for Signal Processing (MLSP2010), pp.118-123, Kittilä, Finland, Aug. 29-Sep. 1, 2010.
    [ paper ]

  44. Ueki, K., Sugiyama, M., & Ihara, Y.
    Perceived age estimation under lighting condition change by covariate shift adaptation.
    In Proceedings of 20th International Conference on Pattern Recognition (ICPR2010), pp.3400-3403, Istanbul, Turkey, Aug. 23-26, 2010.
    [ paper ]

  45. Kimura, A., Kameoka, H., Sugiyama, M., Maeda, E., Sakano, H., & Ishiguro, K.
    SemiCCA: Efficient semi-supervised learning of canonical correlations.
    In Proceedings of 20th International Conference on Pattern Recognition (ICPR2010), pp.2933-2936, Istanbul, Turkey, Aug. 23-26, 2010.
    [ paper ]

  46. Yamada, M. & Sugiyama, M.
    Dependence minimizing regression with model selection for non-linear causal inference under non-Gaussian noise.
    In Proceedings of the Twenty-Fourth AAAI Conference on Artificial Intelligence (AAAI2010), pp.643-648, Atlanta, Georgia, USA, Jul. 11-15, 2010.
    [ paper ]

  47. Morimura, T., Sugiyama, M., Kashima, H., Hachiya, H., & Tanaka, T.
    Parametric return density estimation for reinforcement learning.
    In Proceedings of the 26th Conference on Uncertainty in Artificial Intelligence (UAI2010), pp.368-375, Catalina Island, California, USA, Jul. 8-11, 2010.
    [ paper ]

  48. Tomioka, R., Suzuki, T., Sugiyama, M., & Kashima, H.
    An efficient and general augmented Lagrangian algorithm for learning low-rank matrices.
    In T. Joachims and J. Fürnkranz (Eds.), Proceedings of 27th International Conference on Machine Learning (ICML2010), pp.1087-1094, Haifa, Israel, Jun. 21-25, 2010.
    [ paper ]

  49. Nakajima, S. & Sugiyama, M.
    On non-identifiability of Bayesian matrix factorization models.
    In T. Joachims and J. Fürnkranz (Eds.), Proceedings of 27th International Conference on Machine Learning (ICML2010), pp.815-822, Haifa, Israel, Jun. 21-25, 2010.
    [ paper ]

  50. Morimura, T., Sugiyama, M., Kashima, H., Hachiya, H., & Tanaka, T.
    Nonparametric return distribution approximation for reinforcement learning.
    In T. Joachims and J. Fürnkranz (Eds.), Proceedings of 27th International Conference on Machine Learning (ICML2010), pp.799-806, Haifa, Israel, Jun. 21-25, 2010.
    [ paper ]

  51. Ueki, K., Sugiyama, M., & Ihara, Y.
    Semi-supervised estimation of perceived age from face images.
    In Proceedings of International Conference on Computer Vision Theory and Applications (VISAPP2010), pp.319-324, Angers, France, May 17-21, 2010.
    [ paper ]

  52. Suzuki, T. & Sugiyama, M.
    Sufficient dimension reduction via squared-loss mutual information estimation.
    In Proceedings of Thirteenth International Conference on Artificial Intelligence and Statistics (AISTATS2010), JMLR Workshop and Conference Proceedings, vol.9, pp.804-811, Sardinia, Italy, May 13-15, 2010.
    [ paper ]

  53. Sugiyama, M., Takeuchi, I., Kanamori, T., Suzuki, T., Hachiya, H., & Okanohara, D.
    Conditional density estimation via least-squares density ratio estimation.
    In Proceedings of Thirteenth International Conference on Artificial Intelligence and Statistics (AISTATS2010), JMLR Workshop and Conference Proceedings, vol.9, pp.781-788, Sardinia, Italy, May 13-15, 2010.
    [ paper ]

  54. Sugiyama, M., Hara, S., von Bünau, P., Suzuki, T., Kanamori, T., & Kawanabe, M.
    Direct density ratio estimation with dimensionality reduction.
    In S. Parthasarathy, B. Liu, B. Goethals, J. Pei, and C. Kamath (Eds.), Proceedings of the 10th SIAM International Conference on Data Mining (SDM2010), pp.595-606, Columbus, Ohio, USA, Apr. 29-May 1, 2010.
    [ paper ]

  55. Yamada, M., Sugiyama, M., & Wichern, G.
    Direct importance estimation with probabilistic principal component analyzers.
    In Proceedings of IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP2010), pp.1962-1965, Dallas, Texas, USA, Mar. 14-19, 2010.

  56. Yamada, M., Sugiyama, M., Wichern, G., & Matsui, T.
    Acceleration of sequence kernel computation for real-time speaker identification.
    In Proceedings of IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP2010), pp.1626-1629, Dallas, Texas, USA, Mar. 14-19, 2010.
    [ paper ]

  57. Wichern, G., Yamada, M., Thornburg, H., Sugiyama, M., & Spanias, A.
    Automatic audio tagging using covariate shift adaptation.
    In Proceedings of IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP2010), pp.253-256, Dallas, Texas, USA, Mar. 14-19, 2010.
    [ paper ]

  58. Sugiyama, M.
    Density ratio estimation: A new versatile tool for machine learning.
    In Z.-H. Zhou and T. Washio (Eds.), Advances in Machine Learning, Lecture Notes in Artificial Intelligence, vol.5828, pp.6-9, Berlin, Springer, 2009.
    (Presented at the First Asian Conference on Machine Learning (ACML2009), Nanjing, China, Nov. 2-4, 2009)
    [ paper, slides ]

  59. Li, Y., Koike, Y., & Sugiyama, M.
    A framework of adaptive brain computer interfaces.
    In Proceedings of the 2nd International Conference on BioMedical Engineering and Informatics (BMEI09), pp.473-477, Tianjin, China, Oct. 17-19, 2009.
    [ paper ]

  60. Hachiya, H., Peters, J., & Sugiyama, M.
    Efficient sample reuse in EM-based policy search.
    In W. Buntine, M. Grobelnik, D. Mladenic, and J. Shawe-Taylor (Eds.), Machine Learning and Knowledge Discovery in Databases, Lecture Notes in Computer Science, vol.5781, pp.469-484, Berlin, Springer, 2009.
    (Presented at the European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases (ECML-PKDD2009), Bled, Slovenia, Sep. 7-11, 2009)
    [ paper ]

  61. Akiyama, T., Hachiya, H., & Sugiyama, M.
    Active policy iteration: Efficient exploration through active learning for value function approximation in reinforcement learning.
    In Proceedings of the Twenty-First International Joint Conference on Artificial Intelligence (IJCAI2009), pp.980-985, Pasadena, California, USA, Jul. 11-17, 2009.
    [ paper, demo (wmv) ]

  62. Suzuki, T., Sugiyama, M., & Tanaka, T.
    Mutual information approximation via maximum likelihood estimation of density ratio.
    In Proceedings of 2009 IEEE International Symposium on Information Theory (ISIT2009), pp.463-467, Seoul, Korea, Jun. 28-Jul. 3, 2009.
    [ paper ]

  63. Jankovic, M. V. & Sugiyama, M.
    Probabilistic principal component analysis based on joystick probability selector.
    In Proceedings of 2009 International Joint Conference on Neural Networks (IJCNN2009), pp.1414-1421, Atlanta, Georgia, USA, Jun. 14-19, 2009.
    [ paper ]

  64. Sugiyama, M., Hachiya, H., Kashima, H., & Morimura, T.
    Least absolute policy iteration for robust value function approximation.
    In A. Bicchi (Ed.), Proceedings of 2009 IEEE International Conference on Robotics and Automation (ICRA2009), pp.2904-2909, Kobe, Japan, May 12-17, 2009.
    [ paper, demo (mp4) ]

  65. Kashima, H., Kato, T., Yamanishi, Y., Sugiyama, M., & Tsuda, K.
    Link propagation: A fast semi-supervised learning algorithm for link prediction.
    In H. Park, S. Parthasarathy, H. Liu, and Z. Obradovic (Eds.), Proceedings of 2009 SIAM International Conference on Data Mining (SDM2009), pp.1099-1110, Sparks, Nevada, USA, Apr. 30-May 2, 2009.
    [ paper ]

  66. Kawahara, Y. & Sugiyama, M.
    Change-point detection in time-series data by direct density-ratio estimation.
    In H. Park, S. Parthasarathy, H. Liu, and Z. Obradovic (Eds.), Proceedings of 2009 SIAM International Conference on Data Mining (SDM2009), pp.389-400, Sparks, Nevada, USA, Apr. 30-May 2, 2009.
    [ paper ]

  67. Nakajima, S. & Sugiyama, M.
    Analysis of variational Bayesian matrix factorization.
    In T. Theeramunkong, B. Kijsirikul, N. Cercone, and T.-B. Ho (Eds.), Advances in Knowledge Discovery and Data Mining, Lecture Notes in Computer Science, vol.5476, pp.314-326, Berlin, Springer, 2009.
    (Presented at the 13th Pacific-Asia Conference on Knowledge Discovery and Data Mining (PAKDD2009), Bangkok, Thailand, Apr. 27-30, 2009)
    [ paper ]

  68. Yamada, M., Sugiyama, M., & Matsui, T.
    Covariate shift adaptation for semi-supervised speaker identification.
    In Proceedings of IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP2009), pp.1661-1664, Taipei, Taiwan, Apr. 19-24, 2009.
    [ paper, poster ]

  69. Krämer, N., Sugiyama, M., & Braun, M.
    Lanczos approximations for the speedup of kernel partial least squares regression.
    In D. van Dyk and M. Welling (Eds.), Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics (AISTATS2009), JMLR Workshop and Conference Proceedings, vol.5, pp.288-295, Clearwater Beach, Florida, USA, Apr. 16-18, 2009.
    [ paper, poster ]

  70. Suzuki, T. & Sugiyama, M.
    Estimating squared-loss mutual information for independent component analysis.
    In T. Adali, C. Jutten, J. M. T. Romano, and A. K. Barros (Eds.), Independent Component Analysis and Signal Separation, Lecture Notes in Computer Science, vol.5441, pp.130-137, Berlin, Springer, 2009.
    (Presented at 8th International Conference on Independent Component Analysis and Signal Separation (ICA2009), Paraty, Brazil, Mar. 15-18, 2009)
    [ paper ]

  71. Suzuki, T., Sugiyama, M., Kanamori, T., & Sese, J.
    Mutual information estimation reveals global associations between stimuli and biological processes.
    In M. Q. Zhang, M. S. Waterman, and X. Zhang (Eds.), Proceedings of the Seventh Asia-Pacific Bioinformatics Conference (APBC2009), pp.297-309, Beijing, China, Jan. 13-16, 2009.
    [ paper ]

  72. Hido, S., Tsuboi, Y., Kashima, H., Sugiyama, M., & Kanamori, T.
    Inlier-based outlier detection via direct density ratio estimation.
    In F. Giannotti, D. Gunopulos, F. Turini, C. Zaniolo, N. Ramakrishnan, and X. Wu (Eds.), Proceedings of IEEE International Conference on Data Mining (ICDM2008), pp.223-232, Pisa, Italy, Dec. 15-19, 2008.
    [ paper, slides ]

  73. Kanamori, T., Hido, S., & Sugiyama, M.
    Efficient direct density ratio estimation for non-stationarity adaptation and outlier detection.
    In D. Koller, D. Schuurmans, Y. Bengio, and L. Bottou (Eds.), Advances in Neural Information Processing Systems 21, pp.809-816, Cambridge, MA, MIT Press, 2009.
    (Presented at Neural Information Processing Systems (NIPS2008), Vancouver, British Columbia, Canada, Dec. 8-13, 2008)
    [ paper, poster ]

  74. Sugiyama, M. & Nakajima, S.
    Pool-based agnostic experiment design in linear regression.
    In W. Daelemans, B. Goethals, and K. Morik (Eds.), Machine Learning and Knowledge Discovery in Databases, Lecture Notes in Computer Science, vol.5212, pp.406-422, Berlin, Springer, 2008.
    (Presented at the European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases (ECML-PKDD2008), Antwerp, Belgium, Sep. 15-19, 2008)
    [ paper, slides ]

  75. Suzuki, T., Sugiyama, M., Sese, J., & Kanamori, T.
    Approximating mutual information by maximum likelihood density ratio estimation.
    In Y. Saeys, H. Liu, I. Inza, L. Wehenkel, and Y. Van de Peer (Eds.), Proceedings of the Workshop on New Challenges for Feature Selection in Data Mining and Knowledge Discovery 2008 (FSDM2008), JMLR Workshop and Conference Proceedings, vol.4, pp.5-20, 2008.
    (Presented at the ECML-PKDD2008 Workshop on New Challenges for Feature Selection in Data Mining and Knowledge Discovery 2008 (FSDM2008), Antwerp, Belgium, Sep. 15, 2008)
    [ paper ]

  76. Hachiya, H., Akiyama, T., Sugiyama, M., & Peters, J.
    Adaptive importance sampling with automatic model selection in value function approximation.
    In Proceedings of the Twenty-Third AAAI Conference on Artificial Intelligence (AAAI2008), pp.1351-1356, Chicago, Illinois, USA, Jul. 13-17, 2008.
    [ paper ]

  77. Wang, L., Sugiyama, M., Yang, C., Zhou, Z.-H., & Feng, J.
    On the margin explanation of boosting algorithms.
    In R. Servedio and T. Zhang (Eds.), Proceedings of 21st International Conference on Learning Theory (COLT2008), pp.479-490, Helsinki, Finland, Jul. 9-12, 2008.
    [ paper ]

  78. Takeda, A. & Sugiyama, M.
    Nu-support vector machine as conditional value-at-risk minimization.
    In A. McCallum and S. Roweis (Eds.), Proceedings of 25th International Conference on Machine Learning (ICML2008), pp.1056-1063, Helsinki, Finland, Jul. 5-9, 2008.
    [ paper ]

  79. Sugiyama, M., Idé, T., Nakajima, S., & Sese, J.
    Semi-supervised local Fisher discriminant analysis for dimensionality reduction.
    In T. Washio, E. Suzuki, K. M. Ting, and A. Inokuchi (Eds.), Advances in Knowledge Discovery and Data Mining, Lecture Notes in Computer Science, vol.5012, pp.333-344, Berlin, Springer, 2008.
    (Presented at the 12th Pacific-Asia Conference on Knowledge Discovery and Data Mining (PAKDD2008), Osaka, Japan, May 20-24, 2008)
    [ paper, slides ]

  80. Kato, T., Kashima, H., & Sugiyama, M.
    Integration of multiple networks for robust label propagation.
    In M. J. Zaki, K. Wang, C. Apte, and H. Park (Eds.), Proceedings of the Eighth SIAM International Conference on Data Mining (SDM2008), pp.716-726, Atlanta, Georgia, USA, Apr. 24-26, 2008.
    [ paper ]

  81. Sugiyama, M. & Rubens, N.
    Active learning with model selection in linear regression.
    In M. J. Zaki, K. Wang, C. Apte, and H. Park (Eds.), Proceedings of the Eighth SIAM International Conference on Data Mining (SDM2008), pp.518-529, Atlanta, Georgia, USA, Apr. 24-26, 2008.
    [ paper, poster ]

  82. Tsuboi, Y., Kashima, H., Hido, S., Bickel, S., & Sugiyama, M.
    Direct density ratio estimation for large-scale covariate shift adaptation.
    In M. J. Zaki, K. Wang, C. Apte, and H. Park (Eds.), Proceedings of the Eighth SIAM International Conference on Data Mining (SDM2008), pp.443-454, Atlanta, Georgia, USA, Apr. 24-26, 2008.
    [ paper ]

  83. Rubens, N., Sheinman, V., Tokunaga, T., & Sugiyama, M.
    Order retrieval.
    In T. Tokunaga and A. Ortega (Eds.), Large-scale Knowledge Resources, Lecture Notes in Computer Science, vol.4938, pp.310-317, Berlin, Springer, 2008.
    (Presented at the 3rd International Conference on Large-scale Knowledge Resources (LKR2008), Tokyo, Japan, Mar. 3-5, 2008)

  84. Sugiyama, M., Nakajima, S., Kashima, H., von Bünau, P., & Kawanabe, M.
    Direct importance estimation with model selection and its application to covariate shift adaptation.
    In J. C. Platt, D. Koller, Y. Singer, and S. Roweis (Eds.), Advances in Neural Information Processing Systems 20, pp.1433-1440, Cambridge, MA, MIT Press, 2008.
    (Presented at Neural Information Processing Systems (NIPS2007), Vancouver, British Columbia, Canada, Dec. 3-8, 2007)
    [ paper, poster ]

  85. Kato, T., Kashima, H., Sugiyama, M., & Asai, K.
    Multi-task learning via conic programming.
    In J. C. Platt, D. Koller, Y. Singer, and S. Roweis (Eds.), Advances in Neural Information Processing Systems 20, pp.737-744, Cambridge, MA, MIT Press, 2008.
    (Presented at Neural Information Processing Systems (NIPS2007), Vancouver, British Columbia, Canada, Dec. 3-8, 2007)
    [ paper ]

  86. Rubens, N. & Sugiyama, M.
    Influence-based collaborative active learning.
    In Proceedings of the 2007 ACM Conference on Recommender Systems (RecSys2007), pp.145-148, Minneapolis, Minnesota, USA, Oct. 19-20, 2007.
    [ paper ]

  87. Yamazaki, K., Kawanabe, M., Watanabe, S., Sugiyama, M., & Müller, K.-R.
    Asymptotic Bayesian generalization error when training and test distributions are different.
    In Z. Ghahramani (Ed.), Proceedings of 24th International Conference on Machine Learning (ICML2007), pp.1079-1086, Corvallis, Oregon, USA, Jun. 20-24, 2007.
    [ paper, slides ]

  88. Sugiyama, M., Hachiya, H., Towell, C., & Vijayakumar, S.
    Value function approximation on non-linear manifolds for robot motor control.
    In Proceedings of 2007 IEEE International Conference on Robotics and Automation (ICRA2007), pp.1733-1740, Rome, Italy, Apr. 10-14, 2007.
    [ paper, slides, demo (wmv) ]

  89. Storkey, A. & Sugiyama, M.
    Mixture regression for covariate shift.
    In B. Schölkopf, J. C. Platt, and T. Hofmann (Eds.), Advances in Neural Information Processing Systems 19, pp.1337-1344, Cambridge, MIT Press, 2007.
    (Presented at Neural Information Processing Systems (NIPS2006), Vancouver, British Columbia, Canada, Dec. 4-9, 2006)
    [ paper, poster ]

  90. Sugiyama, M., Blankertz, B., Krauledat, M., Dornhege, G., & Müller, K.-R.
    Importance-weighted cross-validation for covariate shift.
    In K. Franke, K.-R. Müller, B. Nickolay, and R. Schäfer (Eds.), Pattern Recognition, Lecture Notes in Computer Science, vol.4147, pp.354-363, Berlin, Springer, 2006.
    (Presented at 28th Annual Symposium of the German Association for Pattern Recognition (DAGM2006), Berlin, Germany, Sep. 12-14, 2006)
    [ paper, slides ]

  91. Tanaka, A., Sugiyama, M., Imai, H., Kudo, M., & Miyakoshi, M.
    Model selection using a class of kernels with an invariant metric.
    In D.-Y. Yeung, J. T. Kwok, A. Fred, F. Roli, and D. de Ridder (Eds.), Structural, Syntactic, and Statistical Pattern Recognition, Lecture Notes in Computer Science, vol.4109, pp.862-870, Berlin, Springer, 2006.
    (Presented at 6th International Workshop on Statistical Pattern Recognition (SPR2006), Hong Kong, China, Aug. 17-19, 2006)
    [ paper ]

  92. Sugiyama, M.
    Local Fisher discriminant analysis for supervised dimensionality reduction.
    In W. W. Cohen and A. Moore (Eds.), Proceedings of 23rd International Conference on Machine Learning (ICML2006), pp.905-912, Pittsburgh, Pennsylvania, USA, Jun. 25-29, 2006.
    [ paper, slides ]

  93. Sugiyama, M., Kawanabe, M., Blanchard, G., Spokoiny, V., & Müller, K.-R.
    Obtaining the best linear unbiased estimator of noisy signals by non-Gaussian component analysis.
    In Proceedings of 2006 IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP2006), vol.3, pp.608-611, Toulouse, France, May 14-19, 2006.
    [ paper, poster ]

  94. Kawanabe, M., Blanchard, G., Sugiyama, M., Spokoiny, V., & Müller, K.-R.
    A novel dimension reduction procedure for searching non-Gaussian subspaces.
    In J. Rosca, D. Erdogmus, J. C. Príncipe, and S. Haykin (Eds.), Independent Component Analysis and Blind Signal Separation, Lecture Notes in Computer Science, vol.3889, pp.149-156, Berlin, Springer, 2006.
    (Presented at 6th International Conference on Independent Component Analysis and Blind Signal Separation (ICA2006), Charleston, South Carolina, USA, Mar. 5-8, 2006)
    [ paper ]

  95. Sugiyama, M.
    Active learning for misspecified models.
    In Y. Weiss, B. Schölkopf, and J. Platt (Eds.), Advances in Neural Information Processing Systems 18, pp.1305-1312, Cambridge, MIT Press, 2006.
    (Presented at Neural Information Processing Systems (NIPS2005), Vancouver, British Columbia, Canada, Dec. 5-10, 2005)
    [ paper, poster ]

  96. Blanchard, G., Sugiyama, M., Kawanabe, M., Spokoiny, V., & Müller, K.-R.
    Non-Gaussian component analysis: A semiparametric framework for linear dimension reduction.
    In Y. Weiss, B. Schölkopf, and J. Platt (Eds.), Advances in Neural Information Processing Systems 18, pp.131-138, Cambridge, MIT Press, 2006.
    (Presented at Neural Information Processing Systems (NIPS2005), Vancouver, British Columbia, Canada, Dec. 5-8, 2005)
    [ paper, poster ]

  97. Sugiyama, M. & Müller, K.-R.
    Model selection under covariate shift.
    In W. Duch, J. Kacprzyk, E. Oja, and S. Zadrozny (Eds.), Artificial Neural Networks: Formal Models and Their Applications, Lecture Notes in Computer Science, vol.3697, pp.235-240, Berlin, Springer, 2005.
    (Presented at International Conference on Artificial Neural Networks (ICANN2005), Warsaw, Poland, Sep. 11-15, 2005)
    [ paper, slides ]

  98. Sugiyama, M. & Ogawa, H.
    Designing kernel functions using the Karhunen-Loève expansion.
    In Proceedings of Sixteenth International Symposium on Mathematical Theory of Networks and Systems (MTNS2004), pp.N/A(CD-ROM), Leuven, Belgium, Jul. 5-9, 2004.
    [ paper, slides ]

  99. Sugiyama, M., Kawanabe, M., & Müller, K.-R.
    Regularizing generalization error estimators: A novel approach to robust model selection.
    In Proceedings of the 12th European Symposium on Artificial Neural Networks (ESANN2004), pp.163-168, Bruges, Belgium, Apr. 28-30, 2004.
    [ paper, poster ]

  100. Sugiyama, M.
    Estimating the error at given test input points for linear regression.
    In M. H. Hamza (Ed.), Neural Networks and Computational Intelligence, pp.113-118, ACTA Press, Anaheim, 2004.
    (Presented at the Second IASTED International Conference on Neural Networks and Computational Intelligence (NCI2004), Grindelwald, Switzerland, Feb. 23-25, 2004)
    [ paper, slides ]

  101. Sugiyama, M., Okabe, Y., & Ogawa, H.
    On the influence of input noise on a generalization error estimator.
    In M. H. Hamza (Ed.), Artificial Intelligence and Applications, pp.218-223, ACTA Press, Anaheim, 2004.
    (Presented at the IASTED International Conference on Artificial Intelligence and Applications (AIA2004), Innsbruck, Austria, Feb. 16-18, 2004)
    [ paper, slides ]

  102. Sugiyama, M.
    Functional analytic framework for model selection.
    In Proceedings of 13th IFAC Symposium on System Identification (SYSID2003), pp.73-78, Rotterdam, The Netherlands, Aug. 27-29, 2003.
    [ paper, slides ]

  103. Sugiyama, M.
    Model selection for support vector regression.
    Information Technology Letters, vol.1, pp.115-116, 2002.
    (Presented at Forum on Information Technology (FIT2002), Tokyo, Japan, Sep. 25-28, 2002)
    [ paper in Japanese, slides in Japanese ]

  104. Sugiyama, M. & Müller, K.-R.
    Selecting ridge parameters in infinite dimensional hypothesis spaces.
    In J. R. Dorronsoro (Ed.), Artificial Neural Networks, Lecture Notes in Computer Science, vol.2415, pp.528-534, Berlin, Springer, 2002.
    (Presented at International Conference on Artificial Neural Networks (ICANN2002), Madrid, Spain, Aug. 27-30, 2002)
    [ paper, poster ]

  105. Sugiyama, M. & Ogawa, H.
    Release from active learning/model selection dilemma: Optimizing sample points and models at the same time.
    In Proceedings of International Joint Conference on Neural Networks (IJCNN2002), vol.3, pp.2917-2922, Honolulu, Hawaii, USA, May 12-17, 2002.
    [ paper, slides ]

  106. Sugiyama, M., Imaizumi, D., & Ogawa, H.
    Subspace information criterion for image restoration---Mean squared error estimator for linear filters.
    In Proceedings of the 12th Scandinavian Conference on Image Analysis (SCIA2001), pp.169-176, Bergen, Norway, Jun. 11-14, 2001.
    [ paper, slides ]

  107. Sugiyama, M. & Ogawa, H.
    Model selection with small samples.
    In V. Kurkova, N. C. Steele, R. Neruda, and M. Karny (Eds.), Artificial Neural Nets and Genetic Algorithms, pp.418-421, Wien, Springer, 2001.
    (Presented at 5th International Conference on Artificial Neural Networks and Genetic Algorithms (ICANNGA2001), Prague, Czech Republic, Apr. 22-25, 2001)
    [ paper, slides ]

  108. Sugiyama, M. & Ogawa, H.
    Incremental active learning with bias reduction.
    In Proceedings of the IEEE-INNS-ENNS International Joint Conference on Neural Networks (IJCNN2000), vol.1, pp.15-20, Como, Italy, Jul. 24-27, 2000.
    [ paper, slides ]

  109. Sugiyama, M. & Ogawa, H.
    A new information criterion for the selection of subspace models.
    In Proceedings of the 8th European Symposium on Artificial Neural Networks (ESANN2000), pp.69-74, Bruges, Belgium, Apr. 26-28, 2000.
    [ paper, slides ]

  110. Sugiyama, M. & Ogawa, H.
    Training data selection for optimal generalization in trigonometric polynomial networks.
    In S. A. Solla, T. K. Leen, and K.-R. Müller (Eds.), Advances in Neural Information Processing Systems 12, pp.624-630, Cambridge, MIT Press, 2000.
    (Presented at Neural Information Processing Systems---Natural and Synthetic (NIPS1999), Denver, Colorado, USA, Nov. 29-Dec. 4, 1999)
    [ paper, poster ]

  111. Sugiyama, M. & Ogawa, H.
    Pseudo orthogonal bases give the optimal generalization capability in neural network learning.
    In Proceedings of SPIE, Wavelet Applications in Signal and Image Processing VII, vol.3813, pp.526-537, Denver, Colorado, USA, Jul. 19-23, 1999.
    [ paper, slides ]

  112. Sugiyama, M. & Ogawa, H.
    Exact incremental projection learning in the presence of noise.
    In Proceedings of the 11th Scandinavian Conference on Image Analysis (SCIA1999), pp.747-754, Kangerlussuaq, Greenland, Jun. 7-11, 1999.
    [ paper, poster ]

  113. Vijayakumar, S., Sugiyama, M., & Ogawa, H.
    Training data selection for optimal generalization with noise variance reduction in neural networks.
    In M. Marinaro and R. Tagliaferri (Eds.), Neural Nets WIRN Vietri-98, pp.153-166, London, Springer, 1998.
    (Presented at the 10th Italian Workshop on Neural Nets (WIRN Vietri-98), Salerno, Italy, May 21-23, 1998)
    [ paper, poster ]


Books

  1. Sugiyama, M., Ide, T., Kamishima, T., Kurita, T., & Maeda, E. (Eds.), Ijiri, Y., Ide, T., Iwata, T., Kanamori, T., Kanemura, A., Karasuyama, M., Kawahara, Y., Kimura, A., Konishi, Y., Sakai, T., Suzuki, T., Takeuchi, I., Tamaki, T., Deguchi, D., Tomioka, R., Habe, H., Maeda, S., Mochihashi, D., & Yamada, M. (Trans.)
    Elements of Statistical Learning: Data Mining, Inference, and Prediction,
    888 pages, Kyoritsu Publishing, Tokyo, Japan, 2014.

  2. Sugiyama, M.
    An Illustrated Guide to Machine Learning,
    230 pages, Kodansha, Tokyo, Japan, 2013.

  3. Sugiyama, M., Suzuki, T., & Kanamori, T.
    Density Ratio Estimation in Machine Learning,
    344 pages, Cambridge University Press, Cambridge, UK, 2012.
    [Preview by Google Books]

  4. Sugiyama, M. & Kawanabe, M.
    Machine Learning in Non-Stationary Environments: Introduction to Covariate Shift Adaptation,
    308 pages, MIT Press, Cambridge, MA, USA, 2012.
    [Preview by Google Books]

  5. Sugiyama, M. & Yang, Q. (Eds.),
    Proceedings of the Second Asian Conference on Machine Learning (ACML2010),
    346 pages, JMLR Workshop and Conference Proceedings, vol.13, Tokyo, Japan, 2010.

  6. Sugiyama, M.
    Statistical Pattern Recognition: Pattern Recognition Based on Generative Models,
    198 pages, Ohmsha, Tokyo, Japan, 2009.

  7. Quiñonero-Candela, J., Sugiyama, M., Schwaighofer, A., & Lawrence, N. D. (Eds.),
    Dataset Shift in Machine Learning,
    248 pages, MIT Press, Cambridge, MA, USA, 2009.

  8. Hachiya, H. & Sugiyama, M.
    Training Robotic Game Players by Reinforcement Learning,
    219 pages, Mainichi Communications, Tokyo, Japan, 2008.
    [Preview by Google Books]

  9. Motoda, H., Kurita, T., Higuchi, T., Matsumoto, Y., & Murata, N. (Eds.), Akaho, S., Kamishima, T., Sugiyama, M., Onoda, T., Ikeda, K., Kashima, H., Kazawa, H., Nakajima, S., Takeuchi, J., Mochihashi, D., Oyama, S., Ide, T., Shinoda, K., & Yamakawa, H. (Trans.)
    Pattern Recognition and Machine Learning (II): Statistical Inference based on Bayes Theory,
    433 pages, Maruzen Publishing, Tokyo, Japan, 2008.

  10. Motoda, H., Kurita, T., Higuchi, T., Matsumoto, Y., & Murata, N. (Eds.), Akaho, S., Kamishima, T., Sugiyama, M., Onoda, T., Ikeda, K., Kashima, H., Kazawa, H., Nakajima, S., Takeuchi, J., Mochihashi, D., Oyama, S., Ide, T., Shinoda, K., & Yamakawa, H. (Trans.)
    Pattern Recognition and Machine Learning (I): Statistical Inference based on Bayes Theory,
    349 pages, Maruzen Publishing, Tokyo, Japan, 2007.


Articles in Books

  1. Sugiyama, M.
    Big data analysis by density ratio estimation.
    Journal of the Institute of Electronics, Information and Communication Engineers, vol.97, no.5, pp.353-358, 2014.
    [ paper in Japanese ]

  2. Sugiyama, M.
    Big data analysis by least-squares.
    Big Data Management: Data Analysis Technology and Application for Data Scientists, pp.81-87, NTS Inc., Tokyo, 2014.
    [ paper in Japanese ]

  3. Sugiyama, M.
    Density ratio estimation.
    Handbook on Applied Mathematics, pp.588-589, 2013.
    [ paper in Japanese ]

  4. Sugiyama, M.
    Direct approximation of divergences between probability distributions.
    In B. Schölkopf, Z. Luo, and V. Vovk (Eds.), Empirical Inference: Festschrift in Honor of Vladimir N. Vapnik, Chapter 23, pp.273-283, Springer, Berlin, Germany, 2013.
    [ paper ]

  5. Takeuchi, I., Ogawa, K., & Sugiyama, M.
    Parametric programming approach for non-convex problems in machine learning.
    In Bridge between Theory and Application in Optimization Method, Research Institute for Mathematical Sciences Kokyuroku, no.1829, pp.23-38, 2013.
    (Presented at Research Institute for Mathematical Sciences Workshop on the Bridge between Theory and Application in Optimization Method, Kyoto, Japan, Jul. 23-24, 2012)
    [ paper in Japanese ]

  6. Sugiyama, M.
    Statistical machine learning based on density ratio estimation.
    Telecom Frontier, no.81, pp.1-6, 2013.
    [ paper in Japanese ]

  7. Sugiyama, M.
    Function approximation.
    IEICE Handbook on Knowledge Base, no.S3-4-1-5, pp.21-23, 2012.
    [ paper in Japanese ]

  8. Sugiyama, M.
    Automatic data clustering by machine learning.
    Simulation, vol.31, no.2, pp.36-40, 2012.
    [ paper in Japanese ]

  9. Sugiyama, M.
    Learning under non-stationarity: Covariate shift adaptation by importance weighting.
    In J. E. Gentle, W. Härdle, and Y. Mori (Eds.), Handbook of Computational Statistics: Concepts and Methods, 2nd edition, Chapter 31, pp.927-952, Springer, Berlin, Germany, 2012.
    [ paper ]

  10. Sugiyama, M.
    Introduction to machine learning.
    Communications of the Operations Research Society of Japan, vol.57, no.7, pp.353-359, 2012.
    [ paper in Japanese ]

  11. Tomioka, R., Suzuki, T., & Sugiyama, M.
    Augmented Lagrangian methods for learning, selecting, and combining features.
    In S. Sra, S. Nowozin, and S. J. Wright (Eds.), Optimization for Machine Learning, Chapter 9, pp.255-283, MIT Press, Cambridge, MA, USA, 2011.
    [ paper ]

  12. Ueki, K., Ihara, Y., & Sugiyama, M.
    Perceived age estimation from face images.
    In G. Chetty and J. Yang (Eds.), Advanced Biometric Technologies, Chapter 16, pp.325-342, InTech, Rijeka, Croatia, 2011.
    [ paper ]

  13. Rubens, N., Kaplan, D., & Sugiyama, M.
    Active learning in recommender systems.
    In F. Ricci, L. Rokach, B. Shapira, and P. B. Kantor (Eds.), Recommender Systems Handbook, Chapter 23, pp.735-767, Springer, New York, NY, USA, 2010.

  14. Sugiyama, M., Suzuki, T., & Kanamori, T.
    Density ratio estimation: A comprehensive review.
    In Statistical Experiment and Its Related Topics, Research Institute for Mathematical Sciences Kokyuroku, no.1703, pp.10-31, 2010.
    (Presented at Research Institute for Mathematical Sciences Workshop on Statistical Experiment and Its Related Topics, Kyoto, Japan, Mar. 8-10, 2010)
    [ paper ]

  15. Tomioka, R., Suzuki, T., & Sugiyama, M.
    Optimization algorithms for sparse regularization and multiple kernel learning and their applications to image recognition.
    Image Lab, vol.21, no.4, pp.5-11, 2010.
    [ paper in Japanese ]

  16. Sugiyama, M., Rubens, N., & Müller, K.-R.
    A conditional expectation approach to model selection and active learning under covariate shift.
    In J. Quiñonero-Candela, M. Sugiyama, A. Schwaighofer, and N. Lawrence (Eds.), Dataset Shift in Machine Learning, Chapter 7, pp.107-130, MIT Press, Cambridge, MA, USA, 2009.
    [ paper ]

  17. Kitagawa, K., Sugiyama, M., Matsuzaka, T., Ogawa, H., & Suzuki, K.
    Two-wavelength single-shot interferometry.
    Image Lab, vol.19, no.10, pp.37-43, 2008.
    [ paper in Japanese ]

  18. Kitagawa, K., Sugiyama, M., Matsuzaka, T., Ogawa, H., & Suzuki, K.
    Two-wavelength single-shot interferometry.
    Eizojoho Industrial, vol.40, no.2, pp.51-58, 2008.

  19. Sugiyama, M.
    Supervised learning under nonstationary environment: when input distribution changes.
    Image Lab, vol.18, no.10, pp.1-6, 2007.
    [ paper in Japanese ]

  20. Sugiyama, M.
    Supervised learning under covariate shift.
    The Brain & Neural Networks, vol.13, no.3, pp.111-118, 2006.
    [ paper in Japanese ]

  21. Ogawa, H. & Sugiyama, M.
    Active learning for maximal generalization capability.
    In Theories and Applications of Reproducing Kernels, Research Institute for Mathematical Sciences Kokyuroku, no.1352, pp.114-126, 2004.
    (Presented at Research Institute for Mathematical Sciences Workshop on Theories and Applications of Reproducing Kernels, Kyoto, Japan, Oct. 9-10, 2003)
    [ paper ]


Others

  1. Irie, K., Sugiyama, M., & Tomono, M.
    Self-localization in unknown environments using a two-dimensional street map.
    In Proceedings of the 32nd Annual Conference of the Robotics Society of Japan, RSJ2014AC2I2-02, 4 pages, Fukuoka, Japan, Sep. 4-6, 2014.

  2. Sakai, T., Sugiyama, M., Kitagawa, K., & Suzuki, K.
    Registration of infrared-transmission images using squared-loss mutual information.
    In Proceedings of the Japan Society for Precision Engineering 2014 Spring Meeting, pp.973-974, Tokyo, Japan, Mar. 18-20, 2014.

  3. Sugiyama, M. & Takeuchi, I.
    Activity group news: Introduction to machine learning activity group.
    JSIAM Online Magazine, Feb. 11, 2014.

  4. Shiga, M. & Sugiyama, M.
    Conditional density estimation with feature selection.
    IEICE Technical Report, NC2013-56, pp.17-22, Gifu, Japan, Dec. 21, 2013.

  5. Simm, J., Magrans de Abril, I., & Sugiyama, M.
    Tree-based ensemble multi-task learning method for classification and regression.
    Presented at New Directions in Transfer and Multi-Task: Learning Across Domains and Tasks, NIPS2013 Workshop, Lake Tahoe, Nevada, USA, Dec. 9, 2013.

  6. Suzumura, M., Takeuchi, I., & Sugiyama, M.
    Properties of the solution of robust support vector regression and parametric non-convex optimization.
    In Proceedings of 11th Workshop on Informatics (WiNF2013), pp.27-32, Nagoya, Japan, Nov. 30-Dec. 1, 2013.

  7. Xie, N., Zhao, T., & Sugiyama, M.
    Personal style learning in sumi-e stroke-based rendering by machine learning.
    IPSJ SIG Technical Report, vol.2013-CG-153, no.23, pp.1-6, Fukuoka, Japan, Nov. 28-29, 2013.

  8. Nakajima, S., Takeda, A., Babacan, D., Sugiyama, M., & Takeuchi, I.
    Global solver for variational Bayesian low-rank subspace clustering.
    IEICE Technical Report, IBISML2013-37, pp.7-14, Tokyo, Japan, Nov. 10-13, 2013.
    (Presented at 2013 Workshop on Information-Based Induction Sciences (IBIS2013), Tokyo, Japan, Nov. 10-13, 2013)

  9. Shiga, M. & Sugiyama, M.
    Direct conditional probability density estimation based on sparse additive models.
    IEICE Technical Report, IBISML2013-43, pp.53-60, Tokyo, Japan, Nov. 10-13, 2013.
    (Presented at 2013 Workshop on Information-Based Induction Sciences (IBIS2013), Tokyo, Japan, Nov. 10-13, 2013)

  10. Suzumura, M., Takeuchi, I., & Sugiyama, M.
    Non-convex optimization of robust support vector machines by parametric programming.
    IEICE Technical Report, IBISML2013-45, pp.69-75, Tokyo, Japan, Nov. 10-13, 2013.
    (Presented at 2013 Workshop on Information-Based Induction Sciences (IBIS2013), Tokyo, Japan, Nov. 10-13, 2013)

  11. Hocking, T. D., Spanurattana, S., & Sugiyama, M.
    Support vector comparison machines.
    IEICE Technical Report, IBISML2013-51, pp.115-121, Tokyo, Japan, Nov. 10-13, 2013.
    (Presented at 2013 Workshop on Information-Based Induction Sciences (IBIS2013), Tokyo, Japan, Nov. 10-13, 2013)

  12. Sakai, T. & Sugiyama, M.
    Computationally efficient estimation of squared-loss mutual information with multiplicative kernel models.
    IEICE Technical Report, IBISML2013-53, pp.131-137, Tokyo, Japan, Nov. 10-13, 2013.
    (Presented at 2013 Workshop on Information-Based Induction Sciences (IBIS2013), Tokyo, Japan, Nov. 10-13, 2013)

  13. Takeuchi, I., Hongo, T., Sugiyama, M., & Nakajima, S.
    Learning common features of parametrized tasks.
    IEICE Technical Report, IBISML2013-66, pp.225-232, Tokyo, Japan, Nov. 10-13, 2013.
    (Presented at 2013 Workshop on Information-Based Induction Sciences (IBIS2013), Tokyo, Japan, Nov. 10-13, 2013)

  14. Shiino, H., du Plessis, M. C., & Sugiyama, M.
    Online least-squares density-ratio estimation.
    Presented at 2013 Workshop on Information-Based Induction Sciences (IBIS2013), Tokyo, Japan, Nov. 10-13, 2013.

  15. Xie, N., Zhao, T., & Sugiyama, M.
    Personal style learning in sumi-e stroke-based rendering by inverse reinforcement learning.
    Presented at 2013 Workshop on Information-Based Induction Sciences (IBIS2013), Tokyo, Japan, Nov. 10-13, 2013.

  16. Ma, Y., Zhao, T., & Sugiyama, M.
    An optimal online policy gradient algorithm for continuous state and action Markov decision processes.
    Presented at 2013 Workshop on Information-Based Induction Sciences (IBIS2013), Tokyo, Japan, Nov. 10-13, 2013.

  17. Irie, K., Sugiyama, M., & Tomono, M.
    Road recognition from a single image using prior knowledge.
    Presented at 2013 Workshop on Information-Based Induction Sciences (IBIS2013), Tokyo, Japan, Nov. 10-13, 2013.

  18. du Plessis, M. C. & Sugiyama, M.
    Class prior estimation from positive and unlabeled data.
    Presented at 2013 Workshop on Information-Based Induction Sciences (IBIS2013), Tokyo, Japan, Nov. 10-13, 2013.

  19. Zhang, H. & Sugiyama, M.
    Budget allocation and adaptive task assignment in crowdsourcing.
    Presented at 2013 Workshop on Information-Based Induction Sciences (IBIS2013), Tokyo, Japan, Nov. 10-13, 2013.

  20. Tangkaratt, V. & Sugiyama, M.
    Least-squares conditional density estimation with dimensionality reduction.
    Presented at 2013 Workshop on Information-Based Induction Sciences (IBIS2013), Tokyo, Japan, Nov. 10-13, 2013.

  21. Sasaki, H. & Sugiyama, M.
    On estimation of log-density derivatives.
    Presented at 2013 Workshop on Information-Based Induction Sciences (IBIS2013), Tokyo, Japan, Nov. 10-13, 2013.

  22. Irie, K., Tomono, M., & Sugiyama, M.
    Road recognition using prior knowledge and spatial dependency in an image.
    In Proceedings of 31st Annual Conference of the RSJ, 1M2-02, Tokyo, Japan, Sep. 4-6, 2013.

  23. Suzumura, S., Ogawa, K., Takeuchi, I., & Sugiyama, M.
    Optimization for robust support vector machine utilizing homotopy approach.
    IEICE Technical Report, IBISML2013-16, pp.19-24, Tottori, Japan, Sep. 2-3, 2013.

  24. Ihara, Y. & Sugiyama, M.
    Impression estimation from face images by ranking learning.
    Meeting on Image Recognition and Understanding 2013 (MIRU2013), SS4-31, Tokyo, Jul. 29-Aug. 1, 2013.

  25. Liu, S., Quinn, J., Gutmann, M. U., & Sugiyama, M.
    Direct learning of sparse changes in Markov networks by density ratio estimation.
    IEICE Technical Report, IBISML2013-12, pp.81-88, Tokyo, Japan, Jul. 18, 2013.

  26. Sainui, J. & Sugiyama, M.
    Direct approximation of quadratic mutual information and its application to dependence-maximization clustering.
    IEICE Technical Report, IBISML2013-3, pp.15-18, Tokyo, Japan, Jul. 18, 2013.

  27. du Plessis, M. C., & Sugiyama, M.
    Direct approximation of quadratic mutual information and its application to dependence-maximization clustering.
    IEICE Technical Report, IBISML2013-3, pp.15-18, Tokyo, Japan, Jul. 18, 2013.

  28. Nakata, K., Sugiyama, M., Kitagawa, K., & Otsuki, M.
    Improved algorithm for multiwavelength single-shot interferometric surface profiling: Speeding up the multiwavelength-integrated local model fitting method by local information sharing.
    In Proceedings of the Japan Society for Precision Engineering 2013 Student Session, S26, pp.137-138, Tokyo, Japan, Mar. 13, 2013.
    (This paper was selected for Best Presentation Award)

  29. Mori, S., Tangkaratt, V., Zhao, T., Morimoto, J., & Sugiyama, M.
    Model-based policy gradients with parameter-based exploration by least-squares conditional density estimation.
    IEICE Technical Report, IBISML2012-95, pp.17-24, Nagoya, Japan, Mar. 4-5, 2013.

  30. Nguyen, T. D., du Plessis, M. C., Kanamori, T., & Sugiyama, M.
    Constrained least-squares density-difference estimation.
    IEICE Technical Report, IBISML2012-104, pp.79-86, Nagoya, Japan, Mar. 4-5, 2013.

  31. Sugiyama, M.
    Divergence estimation for machine learning and signal processing.
    2013 IEEE International Winter Workshop on Brain-Computer Interface (BCI2013), pp.12-13, High 1 Resort, Korea, Feb. 18-20, 2013.

  32. Khan, R. R. & Sugiyama, M.
    Least squares conditional density estimation in semi-supervised learning settings.
    In Proceedings of 7th International Conference on Electrical and Computer Engineering (ICECE2012), pp.109-112, Dhaka, Bangladesh, Dec. 20-22, 2012.

  33. Ogawa, K., Takeuchi, I., & Sugiyama, M.
    A homotopy approach for nonconvex disjunctive programs in machine learning.
    5th NIPS Workshop on Optimization for Machine Learning (OPT2012), Lake Tahoe, Nevada, USA, Dec. 8, 2012.

  34. Nakajima, S., Tomioka, R., Sugiyama, M., & Babacan, D.
    On dimensionality recovery guarantee of variational Bayesian PCA.
    IEICE Technical Report, IBISML2012-66, pp.229-236, Tokyo, Japan, Nov. 7-9, 2012.
    (Presented at 2012 Workshop on Information-Based Induction Sciences (IBIS2012), Tokyo, Japan, Nov. 7-9, 2012)

  35. Yamada, M., Jitkrittum, W., Sigal, L., Xing, E. P., & Sugiyama, M.
    High-dimensional feature selection by feature-wise kernelized lasso.
    Presented at 2012 Workshop on Information-Based Induction Sciences (IBIS2012), Tokyo, Japan, Nov. 7-9, 2012.

  36. Yamanaka, M., Matsugu, M., & Sugiyama, M.
    Salient object detection based on direct density-ratio estimation.
    IPSJ SIG Technical Report, vol.2012-MPS-90, no.5, pp.1-8, Hokkaido, Japan, Sep. 19-20, 2012.

  37. Yamanaka, M., Matsugu, M., & Sugiyama, M.
    Detection of activities and events without explicit categorization.
    IPSJ SIG Technical Report, vol.2012-MPS-90, no.6, pp.1-8, Hokkaido, Japan, Sep. 19-20, 2012.

  38. Kimura, A., Sugiyama, M., Nakano, T., Kameoka, H., Sakano, H., Maeda, E., & Ishiguro, K.
    SemiCCA: Efficient semi-supervised learning of canonical correlations.
    IPSJ SIG Technical Report, vol.2012-MPS-90, no.22, pp.1-6, Hokkaido, Japan, Sep. 19-20, 2012.

  39. Kimura, A., Sugiyama, M., Sakano, H., & Kameoka, H.
    Designing various multivariate analysis at will via generalized pairwise expression.
    IPSJ SIG Technical Report, vol.2012-MPS-90, no.23, pp.1-6, Hokkaido, Japan, Sep. 19-20, 2012.

  40. Takeuchi, I., Ogawa, K., & Sugiyama, M.
    Homotopy methods for non-convex optimization in machine learning.
    In Proceedings of Annual Conference on the Japan Society for Industrial and Applied Mathematics, pp.169-170, Hokkaido, Japan, Aug. 29-Sep. 1, 2012.

  41. Sugiyama, M.,
    Machine learning by density ratio estimation.
    In Proceedings of Annual Conference on the Japan Society for Industrial and Applied Mathematics, pp.173-174, Hokkaido, Japan, Aug. 29-Sep. 1, 2012.

  42. Takagi, J., Sugiyama, M., Kimura, A., Hachiya, H., Ohishi, Y., & Yamada, M.
    Automatic media annotation with simple semi-supervised probabilistic classifiers.
    In Proceedings of Meeting on Image Recognition and Understanding 2012 (MIRU2012), OS8-02, Fukuoka, Japan, Aug. 6-8, 2012.

  43. Nakajima, S., Sugiyama, M., & Babacan, D.
    Foreground/background video separation by segmentation-based generalized matrix factorization.
    In Proceedings of Meeting on Image Recognition and Understanding 2012 (MIRU2012), OS14-05, Fukuoka, Japan, Aug. 6-8, 2012.

  44. Sugiyama, M.
    Overview of recent advances in density ratio estimation.
    Presented at the 2nd Institute of Mathematical Statistics Asia Pacific Rim Meeting (IMS-APRM2012), Tsukuba, Japan, Jul. 2-4, 2012.

  45. Kanamori, T., Suzuki, T., & Sugiyama, M.
    F-divergence estimation and two-sample test under semiparametric density-ratio models.
    Presented at the 2nd Institute of Mathematical Statistics Asia Pacific Rim Meeting (IMS-APRM2012), Tsukuba, Japan, Jul. 2-4, 2012.

  46. Zhao, T., Hachiya, H., & Sugiyama, M.
    Efficient data reuse in robot control learning via importance sampling.
    Presented at the 2nd Institute of Mathematical Statistics Asia Pacific Rim Meeting (IMS-APRM2012), Tsukuba, Japan, Jul. 2-4, 2012.

  47. Yamada, M. & Sugiyama, M.
    Multimedia information processing with mutual information.
    Presented at the 2nd Institute of Mathematical Statistics Asia Pacific Rim Meeting (IMS-APRM2012), Tsukuba, Japan, Jul. 2-4, 2012.

  48. Ogawa, K., Takeuchi, I., & Sugiyama, M.
    A study on an optimization algorithm for semi-supervised SVM using parametric programming.
    IEICE Technical Report, IBISML2012-1, pp.1-8, Kyoto, Japan, Jun. 19-20, 2012.

  49. Sugiyama, M., Kanamori, T., Suzuki, T., du Plessis, M. C., Liu, S., & Takeuchi, I.
    Density difference estimation.
    IEICE Technical Report, IBISML2012-8, pp.49-56, Kyoto, Japan, Jun. 19-20, 2012.

  50. Magrans de Abril, I. & Sugiyama, M.
    Winning the Kaggle Algorithmic Trading Challenge with the composition of many models and feature engineering.
    IEICE Technical Report, IBISML2012-12, pp.79-84, Kyoto, Japan, Jun. 19-20, 2012.

  51. Yamashita, A., Sugiyama, M., Kitagawa, K., & Kobayashi, H.
    Multiwavelength-integrated local model fitting method for interferometric surface profiling.
    In Proceedings of the Japan Society for Precision Engineering 2012 Spring Meeting, pp.1027-1028, Tokyo, Japan, Mar. 14-16, 2012.

  52. Zhao, T., Hachiya, H., & Sugiyama, M.
    Importance-weighted policy gradients with parameter-based exploration.
    IEICE Technical Report, IBISML2011-95, pp.55-62, Tokyo, Japan, Mar. 12-13, 2012.

  53. du Plessis, M. C. & Sugiyama, M.
    Semi-supervised learning of class-prior probabilities under class-prior change.
    IEICE Technical Report, IBISML2011-102, pp.103-108, Tokyo, Japan, Mar. 12-13, 2012.

  54. Kurihara, N. & Sugiyama, M.
    Improving importance estimation in pool-based batch active learning for approximate linear regression.
    IEICE Technical Report, IBISML2011-105, pp.123-130, Tokyo, Japan, Mar. 12-13, 2012.

  55. Kobayashi, T. & Sugiyama, M.
    Early stopping heuristics in pool-based incremental active learning for least-squares probabilistic classifier.
    IEICE Technical Report, IBISML2011-106, pp.131-138, Tokyo, Japan, Mar. 12-13, 2012.

  56. Jitkrittum, W., Hachiya, H., & Sugiyama, M.
    Feature selection via l1-penalized squared-loss mutual information.
    IEICE Technical Report, IBISML2011-107, pp.139-146, Tokyo, Japan, Mar. 12-13, 2012.

  57. Niu, G., Jitkrittum, W., Hachiya, H., Dai, B., & Sugiyama, M.
    Squared-loss mutual information regularization.
    IEICE Technical Report, IBISML2011-108, pp.147-153, Tokyo, Japan, Mar. 12-13, 2012.

  58. Sugiyama, M.
    Density-ratio estimation: A new versatile tool for machine learning.
    Presented at Japanese-French Frontiers of Science Symposium (JFFoS), Nice, France, Jan. 20-22, 2012.

  59. Yamanaka, M., Sugiyama, M., & Matsugu, M.
    Detection of events without the event category learning.
    In Proceedings of Vision Engineering Workshop 2011 (ViEW2011), pp.235-242, Yokohama, Japan, Dec. 8-9, 2011.

  60. Yamada, M., Suzuki, T., Kanamori, T., Hachiya, H., & Sugiyama, M.
    Relative density-ratio estimation for robust distribution comparison.
    IEICE Technical Report, IBISML2011-46, pp.25-32, Nara, Japan, Nov. 9-11, 2011.
    (Presented at 2011 Workshop on Information-Based Induction Sciences (IBIS2011), Nara, Japan, Nov. 9-11, 2011)

  61. Hachiya, H., Morimura, T., Makino, T., & Sugiyama, M.
    Modified Newton approach to policy search.
    IEICE Technical Report, IBISML2011-54, pp.79-85, Nara, Japan, Nov. 9-11, 2011.
    (Presented at 2011 Workshop on Information-Based Induction Sciences (IBIS2011), Nara, Japan, Nov. 9-11, 2011)

  62. Nakajima, S., Sugiyama, M., & Babacan, D.
    Generalization of matrix factorization for robust PCA.
    IEICE Technical Report, IBISML2011-61, pp.127-134, Nara, Japan, Nov. 9-11, 2011.
    (Presented at 2011 Workshop on Information-Based Induction Sciences (IBIS2011), Nara, Japan, Nov. 9-11, 2011)

  63. Liu, S., Yamada, M., & Sugiyama, M.
    Change-point detection in time-series data by relative density-ratio estimation.
    IEICE Technical Report, IBISML2011-70, pp.187-198, Nara, Japan, Nov. 9-11, 2011.
    (Presented at 2011 Workshop on Information-Based Induction Sciences (IBIS2011), Nara, Japan, Nov. 9-11, 2011)

  64. Nam, H., Hachiya, H., & Sugiyama, M.
    Computationally efficient multi-label classification by least-squares probabilistic classifier.
    IEICE Technical Report, IBISML2011-73, pp.213-216, Nara, Japan, Nov. 9-11, 2011.
    (Presented at 2011 Workshop on Information-Based Induction Sciences (IBIS2011), Nara, Japan, Nov. 9-11, 2011)

  65. Yamanaka, M., Matsugu, M., & Sugiyama, M.
    Automatic detection of regions of interest based on density ratio estimation.
    In Proceedings of 2011 Annual Conference of I.E.E. of Japan, Industry Applications Society, no.2-S9-4, pp.143-149, Okinawa, Japan, Sep. 6-8, 2011.

  66. Nakajima, S., Sugiyama, M., & Babacan, D.
    Global solution of variational Bayesian matrix factorization under matrix-wise independence.
    IEICE Technical Report, IBISML2011-17, pp.1-8, Hokkaido, Japan, Sep. 5-6, 2011.

  67. Karasuyama, M. & Sugiyama, M.
    Canonical correlation analysis based on squared-loss mutual information.
    IEICE Technical Report, IBISML2011-38, pp.173-180, Hokkaido, Japan, Sep. 5-6, 2011.

  68. Xie, N., Hachiya, H., & Sugiyama, M.
    Artist agent (A^2): Stroke painterly rendering based on reinforcement learning.
    IEICE Technical Report, IBISML2011-30, pp.119-126, Hokkaido, Japan, Sep. 5-6, 2011.

  69. Kimura, A., Sugiyama, M., Kameoka, H., & Sakano, H.
    Generalized multivariate analysis with extended pairwise expression.
    In Proceedings of Meeting on Image Recognition and Understanding 2011 (MIRU2011), pp.10-17, Ishikawa, Japan, Jul. 20-22, 2011.

  70. Zhao, T., Hachiya, H., Niu, G., & Sugiyama, M.
    Analysis and improvement of policy gradient estimation.
    IEICE Technical Report, IBISML2011-12, pp.83-89, Tokyo, Japan, Jun. 20-21, 2011.

  71. du Plessis, M. C., Yamada, M., & Sugiyama, M.
    Estimation of squared-loss mutual information from paired and unpaired samples.
    IEICE Technical Report, IBISML2011-11, pp.75-81, Tokyo, Japan, Jun. 20-21, 2011.

  72. Niu, G., Dai, B., Yamada, M., & Sugiyama, M.
    SERAPH: Semi-supervised metric learning paradigm with hyper sparsity.
    IEICE Technical Report, IBISML2011-8, pp.51-58, Tokyo, Japan, Jun. 20-21, 2011.

  73. Yamada, M. & Sugiyama, M.
    Direct density-ratio estimation with dimensionality reduction via hetero-distributional subspace analysis.
    IEICE Technical Report, IBISML2011-1, pp.1-6, Tokyo, Japan, Jun. 20-21, 2011.

  74. Niu, G., Dai, B., Shang, L., & Sugiyama, M.
    Maximum volume clustering.
    In Proceedings of The 5th International Workshop on Data-Mining and Statistical Science (DMSS2011), Osaka, Japan, Mar. 29-30, 2011.

  75. Yamada, M., Niu, G., Takagi, J., & Sugiyama, M.
    Sufficient component analysis for supervised dimension reduction.
    In Proceedings of The 5th International Workshop on Data-Mining and Statistical Science (DMSS2011), Osaka, Japan, Mar. 29-30, 2011.

  76. Kimura, M. & Sugiyama, M.
    Dependence-maximization clustering with least-squares mutual information.
    In Proceedings of The 5th International Workshop on Data-Mining and Statistical Science (DMSS2011), Osaka, Japan, Mar. 29-30, 2011.

  77. Takeuchi, I. & Sugiyama, M.
    Adaptive target neighbor change for feature weighting in nearest neighbor classification.
    In Proceedings of The 5th International Workshop on Data-Mining and Statistical Science (DMSS2011), Osaka, Japan, Mar. 29-30, 2011.

  78. Hachiya, H., Sugiyama, M., & Ueda, N.
    Importance-weighted least-squares probabilistic classifier for covariate shift adaptation with application to human activity recognition.
    In Proceedings of The 5th International Workshop on Data-Mining and Statistical Science (DMSS2011), Osaka, Japan, Mar. 29-30, 2011.

  79. Suzuki, T., Tomioka, R., & Sugiyama, M.
    Fast convergence rate of multiple kernel learning with elastic-net regularization.
    IEICE Technical Report, IBISML2010-126, pp.153-160, Osaka, Japan, Mar. 28-29, 2011.

  80. Nakajima, S., Sugiyama, M., & Babacan, D.
    On automatic dimensionality selection in probabilistic PCA.
    IEICE Technical Report, IBISML2010-123, pp.131-138, Osaka, Japan, Mar. 28-29, 2011.

  81. Sugiyama, M., Yamada, M., Kimura, M., & Hachiya, H.
    Information-maximization clustering: Analytic solution and model selection.
    IEICE Technical Report, IBISML2010-114, pp.69-76, Osaka, Japan, Mar. 28-29, 2011.

  82. Kanamori, T., Suzuki, T., & Sugiyama, M.
    Statistical analysis of kernel-based density-ratio estimators.
    IEICE Technical Report, IBISML2010-110, pp.41-48, Osaka, Japan, Mar. 28-29, 2011.

  83. Mori, S., Sugiyama, M., Ogawa, H., Kitagawa, K., & Irie, K.
    Automatic parameter optimization of the local model fitting method for single-shot surface profiling.
    In Proceedings of the Japan Society for Precision Engineering 2011 Spring Meeting, pp.1031-1032, Tokyo, Japan, Mar. 14-16, 2011.

  84. Ihara, Y., Sugiyama, M., Ueki, K., & Fujita, M.
    Multi-race age classification by weighted combination of multiple classifiers.
    In Proceedings of Dynamic Image Processing for Real Application (DIA2011), pp.317-322, Tokushima, Japan, Mar. 3-4, 2011.

  85. Hachiya, H., Sugiyama, M., & Ueda, N.
    Coping with new user problems: Transfer learning in accelerometer-based human activity recognition.
    NIPS 2010 Workshop on Transfer Learning by Learning Rich Generative Models, Whistler, British Columbia, Canada, Dec. 11, 2010.

  86. Yamanaka, M., Matsugu, M., & Sugiyama, M.
    Automatic detection of regions of interest using multiple visual saliency measures based on density ratio estimation.
    In Proceedings of Vision Engineering Workshop 2010 (ViEW2010), pp.7-8, Yokohama, Japan, Dec. 9-10, 2010.

  87. Takagi, J., Ohishi, Y., Kimura, A., Sugiyama, M., Yamada, M., & Kameoka, H.
    Automatic audio tagging and retrieval based on semi-supervised canonical density estimation.
    IEICE Technical Report, PRMU2010-126, pp.1-6, Yamaguchi, Japan, Dec. 9-10, 2010.

  88. Hachiya, H. & Sugiyama, M.
    Feature selection for reinforcement learning: Evaluating implicit state-reward dependency via conditional mutual information.
    Presented at the Second Asian Conference on Machine Learning (ACML2010), Tokyo, Japan, Nov. 8-10, 2010.

  89. Yamada, M. & Sugiyama, M.
    Dependence minimizing regression with model selection for non-linear causal inference under non-Gaussian noise.
    Presented at the Second Asian Conference on Machine Learning (ACML2010), Tokyo, Japan, Nov. 8-10, 2010.

  90. Nakajima, S. & Sugiyama, M.
    Model-induced regularization.
    Presented at the Second Asian Conference on Machine Learning (ACML2010), Tokyo, Japan, Nov. 8-10, 2010.

  91. Niu, G., Dai, B., Shang, L., & Sugiyama, M.
    Maximum volume clustering.
    Presented at the Second Asian Conference on Machine Learning (ACML2010), Tokyo, Japan, Nov. 8-10, 2010.

  92. Nakajima, S., Sugiyama, M., & Tomioka, R.
    Global analytic solution for variational Bayesian matrix factorization and its model-induced regularization.
    IEICE Technical Report, IBISML2010-99, pp.291-302, 2010.
    (Presented at 2010 Workshop on Information-Based Induction Sciences (IBIS2010), Tokyo, Japan, Nov. 4-6, 2010)

  93. Morimura, T., Sugiyama, M., Kashima, H., Hachiya, H., & Tanaka, T.
    Return density estimation with dynamic programming.
    IEICE Technical Report, IBISML2010-98, pp.283-290, 2010.
    (Presented at 2010 Workshop on Information-Based Induction Sciences (IBIS2010), Tokyo, Japan, Nov. 4-6, 2010)

  94. Sugiyama, M., Suzuki, T., & Kanamori, T.
    A unified framework of density ratio estimation under Bregman divergence.
    IEICE Technical Report, IBISML2010-64, pp.33-44, 2010.
    (Presented at 2010 Workshop on Information-Based Induction Sciences (IBIS2010), Tokyo, Japan, Nov. 4-6, 2010)

  95. Yamada, M. & Sugiyama, M.
    Cross-domain object matching via maximization of squared-loss mutual information.
    IEICE Technical Report, IBISML2010-61, pp.13-18, 2010.
    (Presented at 2010 Workshop on Information-Based Induction Sciences (IBIS2010), Tokyo, Japan, Nov. 4-6, 2010)

  96. Takagi, J., Ohishi, Y., Kimura, A., Sugiyama, M., & Kameoka, H.
    Automatic audio tagging and retrieval based on semi-supervised canonical density estimation.
    Presented at 2010 Workshop on Information-Based Induction Sciences (IBIS2010), Tokyo, Japan, Nov. 4-6, 2010.

  97. Kanamori, T., Suzuki, T., & Sugiyama, M.
    Two-sample test by density ratio estimation.
    The 2010 Japanese Joint Statistical Meeting, p.52, Tokyo, Japan, Sep. 5-8, 2010.

  98. Suzuki, T., Tomioka, R., & Sugiyama, M.
    On asymptotic properties of multiple kernel learning with elasticnet-type regularization.
    The 2010 Japanese Joint Statistical Meeting, p.55, Tokyo, Japan, Sep. 5-8, 2010.

  99. Sugiyama, M., Suzuki, T., Itoh, Y., Kanamori, T., & Kimura, M.
    A density ratio approach to two-sample test.
    IEICE Technical Report, IBISML2010-48, pp.149-156, Fukuoka, Japan, Sep. 5-6, 2010.

  100. Simm, J., Sugiyama, M., & Kato, T.
    Multi-task learning with least-squares probabilistic classifiers.
    IEICE Technical Report, IBISML2010-33, pp.51-56, Fukuoka, Japan, Sep. 5-6, 2010.

  101. Yamada, M., Sugiyama, M., Wichern, G., & Simm, J.
    Improving the accuracy of least-squares probabilistic classifiers.
    IEICE Technical Report, IBISML2010-32, pp.45-50, Fukuoka, Japan, Sep. 5-6, 2010.

  102. Morimura, T., Sugiyama, M., Kashima, H., Hachiya, H., & Tanaka, T.
    Convergence analysis of dynamic programming for distributional Bellman equation.
    In Proceedings of Electronics, Information and Systems Conference, Electronics, Information and Systems Society, Institute of Electrical Engineers of Japan, pp.178-183, Kumamoto, Japan, Sep. 2-3, 2010.

  103. Kato, T., Kashima, H., Sugiyama, M., & Asai, K.
    Using local constraints for multi-task learning algorithm.
    In Proceedings of Meeting on Image Recognition and Understanding 2010 (MIRU2010), pp.1467-1474, Hokkaido, Japan, Jul. 27-29, 2010.

  104. Kimura, A., Nakano, T., Sugiyama, M., Kameoka, H., Maeda, E., & Sakano, H.
    SSCDE: Semi-supervised canonical density estimation for automatic image annotation retrieval.
    In Proceedings of Meeting on Image Recognition and Understanding 2010 (MIRU2010), pp.1396-1403, Hokkaido, Japan, Jul. 27-29, 2010.
    (This paper was selected for Best Interactive Session Award)

  105. Yamanaka, M., Matsugu, M., & Sugiyama, M.
    Application of density ratio estimation to region-of-interest detection in images.
    In Proceedings of Meeting on Image Recognition and Understanding 2010 (MIRU2010), pp.67-74, Hokkaido, Japan, Jul. 27-29, 2010.

  106. Kashima, H., Yamanishi, Y., Kato, T., Sugiyama, M., & Tsuda, K.
    Simultaneous inference of multiple biological networks.
    IPSJ SIG Technical Report, vol.2010-BIO-21, no.19, pp.1-8, Okinawa, Japan, Jun. 18-19, 2010.

  107. Yamada, M. & Sugiyama, M.
    Dependence minimizing regression with model selection for non-linear causal inference under non-Gaussian noise.
    IEICE Technical Report, IBISML2010-22, pp.145-151, Tokyo, Japan, Jun. 14-15, 2010.

  108. Hachiya, H. & Sugiyama, M.
    New feature selection method for reinforcement learning: Conditional mutual information reveals implicit state-reward dependency.
    IEICE Technical Report, IBISML2010-21, pp.137-144, Tokyo, Japan, Jun. 14-15, 2010.

  109. Sugiyama, M.
    Advances in statistical machine learning: An approach based on probability density ratios.
    IEICE Technical Report, IBISML2010-1, p.1, Tokyo, Japan, Jun. 14-15, 2010.

  110. Kanamori, T., Suzuki, T., & Sugiyama, M.
    Theoretical analysis of density ratio estimation.
    In Proceedings of the Japanese Society for Artificial Intelligence, 12th Meeting of Special Interest Group on Data Mining and Statistical Mathematics, SIG-DMSM-A903-11, pp.65-77, Tokyo, Japan, Mar. 29-30, 2010.

  111. Krämer, N., Sugiyama, M., & Braun, M.
    The degrees of freedom of partial least squares regression.
    Second Joint Statistical Meeting Deutsche Arbeitsgemeinschaft Statistik (DAGStat2010), p.217, Dortmund, Germany, Mar. 23-26, 2010.

  112. Wichern, G., Yamada, M., Thornburg, H., Sugiyama, M., & Spanias, A.
    Automatic audio tagging using covariate shift adaptation.
    In Proceedings of Acoustical Society of Japan 2010 Spring Meeting, no.2-8-19, pp.989-990, Tokyo, Japan, Mar. 8-10, 2010.

  113. Kurihara, N., Sugiyama, M., Ogawa, H., Kitagawa, K., & Suzuki, K.
    One-shot surface profiling by weighted local model fitting.
    In Proceedings of Dynamic Image Processing for Real Application (DIA2010), pp.249-254, Yamanashi, Japan, Mar. 4-5, 2010.

  114. Kato, T., Kashima, H., Sugiyama, M., & Asai, K.
    An SOCP formulation for multi-task learning.
    IPSJ SIG Technical Report, vol.2010-MPS-77, no.8, pp.1-6, Shizuoka, Japan, Mar. 4-5, 2010.

  115. Sugiyama, M.
    Superfast probabilistic classifier.
    IEICE Technical Report, CQ2009-74, pp.127-132, Kyoto, Japan, Jan. 21-22, 2010.
    (This paper was selected for Incentive Award)

  116. Kudo, M., Imai, H., Tanaka, A., & Sugiyama, M.
    Urban legends in pattern recognition.
    IEICE Technical Report, PRMU2009-142, pp.29-34, Tochigi, Japan, Dec. 17-18, 2009.

  117. Sugiyama, M., Takeuchi, I., Suzuki, T., Kanamori, T., Hachiya, H., & Okanohara, D.
    Conditional density estimation based on density ratio estimation.
    IPSJ SIG Technical Report, vol.2009-MPS-76, no.4, pp.1-8, Tokyo, Japan, Dec. 17-18, 2009.

  118. Simm, J., Sugiyama, M., & Hachiya, H.
    Improving model-based reinforcement learning with multitask learning.
    IPSJ SIG Technical Report, vol.2009-MPS-76, no.3, pp.1-8, Tokyo, Japan, Dec. 17-18, 2009.

  119. Tomioka, R., Suzuki, T., & Sugiyama, M.
    Super-linear convergence of dual augmented Lagrangian algorithm for sparse learning.
    2nd NIPS Workshop on Optimization for Machine Learning (OPT2009), Whistler, British Columbia, Canada, Dec. 12, 2009.

  120. Ihara, Y., Sugiyama, M., & Ueki, K.
    Age estimation using covariate shift adaptation.
    In Proceedings of Vision Engineering Workshop 2009 (ViEW2009), pp.325-330, Yokohama, Japan, Dec. 3-4, 2009.

  121. Sugiyama, M., Hara, S., von Bünau, P., Suzuki, T., Kanamori, T., & Kawanabe, M.
    Dimensionality reduction for density ratio estimation based on Pearson divergence maximization.
    Presented at 2009 Workshop on Information-Based Induction Sciences (IBIS2009), Fukuoka, Japan, Oct. 19-21, 2009.

  122. Kimura, A., Kameoka, H., Sugiyama, M., Maeda, E., Sakano, H., & Ishiguro, K.
    SemiCCA: Efficient semi-supervised learning of canonical correlations.
    Presented at 2009 Workshop on Information-Based Induction Sciences (IBIS2009), Fukuoka, Japan, Oct. 19-21, 2009.

  123. Kato, T., Kashima, H., Sugiyama, M., & Asai, K.
    Second-order cone programming for multi-task learning.
    Presented at 2009 Workshop on Information-Based Induction Sciences (IBIS2009), Fukuoka, Japan, Oct. 19-21, 2009.

  124. Morimura, T., Sugiyama, M., Kashima, H., Hachiya, H., & Tanaka, T.
    Return distribution estimation for risk-sensitive reinforcement learning.
    Presented at 2009 Workshop on Information-Based Induction Sciences (IBIS2009), Fukuoka, Japan, Oct. 19-21, 2009.

  125. Simm, J., Sugiyama, M., & Hachiya, H.
    Observational reinforcement learning.
    Technical Report on Information-Based Induction Sciences 2009 (IBIS2009), pp.120-127, Fukuoka, Japan, Oct. 19-21, 2009.

  126. Yamada, M., Wichern, G., Sugiyama, M., & Kondo, K.
    Semi-blind source separation under ambient noise condition change.
    In Proceedings of Acoustical Society of Japan 2009 Autumn Meeting, pp.751-754, Fukushima, Japan, Sep. 15-17, 2009.

  127. Suzuki, T. & Sugiyama, M.
    Sufficient dimension reduction via squared-loss mutual information estimation.
    The 2009 Japanese Joint Statistical Meeting, p.310, Kyoto, Japan, Sep. 6-9, 2009.

  128. Kanamori, T., Suzuki, T., & Sugiyama, M.
    Condition number analysis of density ratio estimation.
    The 2009 Japanese Joint Statistical Meeting, p.163, Kyoto, Japan, Sep. 6-9, 2009. (in Japanese)

  129. Tomioka, R., Suzuki, T., & Sugiyama, M.
    Optimization algorithms for sparse regularization and multiple kernel learning and their applications to CV/PR.
    IEICE Technical Report, PRMU2009-63, pp.43-48, Sendai, Japan, Aug. 31-Sep. 1, 2009.
    [ demo (by Ryota Tomioka) ]

  130. Takeda, A. & Sugiyama, M.
    Non-convex optimization of extended nu-support vector machine.
    Presented at the 20th International Symposium on Mathematical Programming (ISMP2009), Chicago, Illinois, USA, Aug. 23-28, 2009.

  131. Ueki, K., Sugiyama, M., & Ihara, Y.
    Active sample selection and weighted semi-supervised regression for perceived age estimation.
    In Proceedings of Meeting on Image Recognition and Understanding 2009 (MIRU2009), pp.260-265, Shimane, Japan, Jul. 20-22, 2009.

  132. Suzuki, T. & Sugiyama, M.
    Sufficient dimension reduction via squared-loss mutual information estimation.
    In Proceedings of The Fourth International Workshop on Data-Mining and Statistical Science (DMSS2009), pp.68-77, Kyoto, Japan, Jul. 7-8, 2009.

  133. Sugiyama, M., Kawanabe, M., & Chui, P. L.
    Dimensionality reduction for density ratio estimation in high-dimensional spaces.
    In Proceedings of The Fourth International Workshop on Data-Mining and Statistical Science (DMSS2009), pp.31-67, Kyoto, Japan, Jul. 7-8, 2009.

  134. Takimoto, M., Matsugu, M., & Sugiyama, M.
    Visual inspection of precision instruments by least-squares outlier detection.
    In Proceedings of The Fourth International Workshop on Data-Mining and Statistical Science (DMSS2009), pp.22-26, Kyoto, Japan, Jul. 7-8, 2009.

  135. Akiyama, T., Hachiya, H., & Sugiyama, M.
    Efficient exploration through active learning for value function approximation in reinforcement learning.
    In Proceedings of The Fourth International Workshop on Data-Mining and Statistical Science (DMSS2009), pp.1-21, Kyoto, Japan, Jul. 7-8, 2009.
    (This paper was selected for JSAI Incentive Award)

  136. Kanamori, T., Suzuki, T., & Sugiyama, M.
    Condition number analysis of kernel-based density ratio estimation.
    Presented at Numerical Mathematics in Machine Learning (NUMML2009), Montreal, Quebec, Canada, Jun. 18, 2009.

  137. Ueki, K., Sugiyama, M., & Ihara, Y.
    Perceived age estimation using weighted regression.
    In Proceedings of Symposium on Sensing via Image Information (SSII09), no.IS4-23 (CD-ROM), Yokohama, Japan, Jun. 10-12, 2009.

  138. Hachiya, H., Akiyama, T., Sugiyama, M., & Peters, J.
    Efficient data reuse in value function approximation.
    In 2009 IEEE Symposium on Adaptive Dynamic Programming and Reinforcement Learning (ADPRL2009) Proceedings, pp.8-15, Nashville, Tennessee, USA, Mar. 29-Apr. 2, 2009.

  139. Yamada, M., Sugiyama, M., & Matsui, T.
    Covariate shift adaptation for speaker identification.
    In Proceedings of Acoustical Society of Japan 2009 Spring Meeting, no.2-5-13, pp.77-78, Tokyo, Japan, Mar. 17-19, 2009.

  140. Sugiyama, M., Kanamori, T., Suzuki, T., Hido, S., Sese, J., Takeuchi, I., & Wang, L.
    Methods and applications of density ratio estimation.
    In Proceedings of Acoustical Society of Japan 2009 Spring Meeting, no.2-5-12, pp.73-76, Tokyo, Japan, Mar. 17-19, 2009.

  141. Kashima, H., Kato, T., Yamanishi, Y., Sugiyama, M., & Tsuda, K.
    Link propagation: A fast semi-supervised learning algorithm for link prediction.
    In Proceedings of the Japanese Society for Artificial Intelligence, 73rd Meeting of Special Interest Group on Fundamental Problem in Artificial Intelligence, pp.19-24, Tokyo, Japan, Mar. 13-14, 2009.

  142. Akiyama, T., Hachiya, H., & Sugiyama, M.
    Statistical active learning for efficient value function approximation in reinforcement learning.
    IEICE Technical Report, NC2008-147, pp.261-266, Tokyo, Japan, Mar. 11-13, 2009.

  143. Hachiya, H., Peters, J., & Sugiyama, M.
    Adaptive importance sampling with automatic model selection in reward weighted regression.
    IEICE Technical Report, NC2008-145, pp.249-254, Tokyo, Japan, Mar. 11-13, 2009.
    (Based on this work, Hirotaka Hachiya was awarded the Young Researcher Award from IEEE Computational Intelligence Society Japan Chapter)

  144. Suzuki, T. & Sugiyama, M.
    Independent component analysis by direct density-ratio estimation.
    IEICE Technical Report, NC2008-136, pp.195-199, Tokyo, Japan, Mar. 11-13, 2009.

  145. Yokota, T., Sugiyama, M., Ogawa, H., Kitagawa, K., & Suzuki, K.
    Error analysis of local model fitting method in single-shot surface profiling.
    In Proceedings of the Japan Society for Precision Engineering 2009 Spring Meeting, no.C02, pp.167-168, Tokyo, Japan, Mar. 11-13, 2009.

  146. Sugiyama, M., Hachiya, H., & Akiyama, T.
    Robot control by reinforcement learning: A machine-learning approach.
    In Proceedings of the Society of Instrument and Control Engineers, the 9th Control Division Conference, no.FC1-3, Hiroshima, Japan, Mar. 4-6, 2009.

  147. Sugiyama, M.
    Document classification by local Fisher discriminant analysis.
    IEICE Technical Report, PRMU2008-225, pp.105-108, Tokyo, Japan, Feb. 19-20, 2009.

  148. Rubens, N., Tomioka, R., & Sugiyama, M.
    Output divergence criterion for active learning in collaborative settings.
    In Proceedings of IPSJ SIG Mathematical Modelling and Problem Solving, no.126, pp.65-68, Osaka, Japan, Dec. 17-18, 2008.

  149. Tomioka, R. & Sugiyama, M.
    Sparse learning with duality gap guarantee.
    Presented at NIPS2008 Workshop on Optimization for Machine Learning (OPT2008), Whistler, British Columbia, Canada, Dec. 12-13, 2008.

  150. Jankovic, M., Sugiyama, M., & Reljin, B.
    Tensor-based image segmentation.
    In B. Reljin and S. Stankovic (Eds.), Ninth Symposium on Neural Networks Applications in Electrical Engineering (NEUREL2008), pp.145-148, Belgrade, Serbia, Sep. 25-27, 2008.

  151. Yamada, M. & Sugiyama, M.
    Semi-supervised speaker identification under covariate shift.
    In Proceedings of The Third International Workshop on Data-Mining and Statistical Science (DMSS2008), pp.55-58, Tokyo, Japan, Sep. 25-26, 2008.

  152. Kato, T., Kashima, H., & Sugiyama, M.
    Using product-of-Student-t for label propagation on multiple networks.
    In Proceedings of The Third International Workshop on Data-Mining and Statistical Science (DMSS2008), pp.20-23, Tokyo, Japan, Sep. 25-26, 2008.

  153. Kato, T., Kashima, H., & Sugiyama, M.
    Protein function prediction by integration of heterogeneous biological networks.
    In Proceedings of Information Processing Society of Japan (IPSJ), Special Interest Group on Bioinformatics and Genomics (SIG BIO), vol.2008, no.86, pp.47-50, Sapporo, Japan, Sep. 18-19, 2008.

  154. Naito, T., Sugiyama, M., Ogawa, H., Kitagawa, K., & Suzuki, K.
    Single-shot interferometry of film-covered objects: local model fitting for simultaneous measurement of film thickness and surface profile of film-covered objects.
    In Proceedings of The Japan Society for Precision Engineering 2008 Autumn Semestrial Conference, no.C33, pp.183-184, Sendai, Japan, Sep. 17-19, 2008.

  155. Suzuki, T., Sugiyama, M., Sese, J., & Kanamori, T.
    A least-squares approach to mutual information estimation with application in variable selection.
    In Proceedings of Workshop on New Challenges for Feature Selection in Data Mining and Knowledge Discovery 2008 (FSDM2008), Antwerp, Belgium, Sep. 15, 2008.

  156. Kanamori, T., Hido, S., & Sugiyama, M.
    Learning and density ratio estimation under covariate shift.
    The 2008 Japanese Joint Statistical Meeting, p.196, Yokohama, Japan, Sep. 7-10, 2008.

  157. Sugiyama, M., Kanamori, T., Suzuki, T., Hido, S., Sese, J., Takeuchi, I., & Wang, L.
    Direct importance estimation---A new versatile tool for statistical pattern recognition.
    In Proceedings of Meeting on Image Recognition and Understanding 2008 (MIRU2008), pp.29-36, Nagano, Japan, Jul. 29-31, 2008.
    (This paper was selected for Best Paper Runner-up Award)

  158. Suzuki, K., Ogawa, H., Kitagawa, K., & Sugiyama, M.
    Two-wavelength single-shot interferometry for precise surface profiling.
    In Proceedings of Optical Measurement Symposium 2008, pp.35-38, Yokohama, Japan, Jun. 11, 2008.

  159. Akiyama, T., Hachiya, H., & Sugiyama, M.
    A new method of model selection for value function approximation in reinforcement learning.
    In Proceedings of the Japanese Society for Artificial Intelligence, 6th Meeting of Special Interest Group on Data Mining and Statistical Mathematics, SIG-DMSM-A703-09, pp.55-60, Osaka, Japan, Feb. 28-29, 2008.

  160. Hachiya, H., Akiyama, T., & Sugiyama, M.
    Adaptive importance sampling with automatic model selection in value function approximation.
    IEICE Technical Report, NC2007-84, pp.75-80, Nagoya, Japan, Dec. 22, 2007.

  161. Hachiya, H., Akiyama, T., & Sugiyama, M.
    Efficient sample reuse by covariate shift adaptation in value function approximation.
    Presented at NIPS2007 Workshop on Robotics Challenges for Machine Learning, Whistler, British Columbia, Canada, Dec. 7, 2007.

  162. Kitagawa, K., Sugiyama, M., Matsuzaka, T., Ogawa, H., & Suzuki, K.
    Two-wavelength single-shot interferometry.
    In Proceedings of Vision Engineering Workshop 2007 (ViEW2007), pp.189-194, Yokohama, Japan, Dec. 6-7, 2007.
    (This paper was selected for Odawara Award 2nd Prize)

  163. Sugiyama, M., Idé, T., Nakajima, S., & Sese, J.
    Semi-supervised local Fisher discriminant analysis for dimensionality reduction.
    In Proceedings of 2007 Workshop on Information-Based Induction Sciences (IBIS2007), pp.1-6, Yokohama, Japan, Nov. 5-7, 2007.

  164. Hido, S., Tsuboi, Y., Kashima, H., & Sugiyama, M.
    Novelty detection by density ratio estimation.
    In Proceedings of 2007 Workshop on Information-Based Induction Sciences (IBIS2007), pp.197-204, Yokohama, Japan, Nov. 5-7, 2007.

  165. Wang, L. & Sugiyama, M.
    Equilibrium margin---A new concept for characterizing generalization error of voting classifiers.
    In Proceedings of 2007 Workshop on Information-Based Induction Sciences (IBIS2007), pp.49-54, Yokohama, Japan, Nov. 5-7, 2007.

  166. Kato, T., Kashima, H., Sugiyama, M., & Asai, K.
    Probabilistic label propagation on multiple networks.
    In Proceedings of 2007 Workshop on Information-Based Induction Sciences (IBIS2007), pp.43-48, Yokohama, Japan, Nov. 5-7, 2007.

  167. Rubens, N. & Sugiyama, M.
    Explorative active learning for collaborative filtering.
    In Proceedings of the Japanese Society for Artificial Intelligence, 67th Meeting of Special Interest Group on Fundamental Problem in Artificial Intelligence, pp.1-5, Yokohama, Japan, Nov. 3-4, 2007.

  168. Sugiyama, M., Nakajima, S., Kashima, H., von Bünau, P., & Kawanabe, M.
    Kullback-Leibler importance estimation procedure for covariate shift adaptation.
    In Proceedings of the International Workshop on Data-Mining and Statistical Sciences (DMSS2007), pp.31-49, Tokyo, Japan, Oct. 5-6, 2007.

  169. Kitagawa, K., Sugiyama, M., Matsuzaka, T., Ogawa, H., & Suzuki, K.
    Two-wavelength single-shot interferometry.
    In Proceedings of the Society of Instrument and Control Engineers Annual Conference (SICE2007), pp.724-728, Takamatsu, Japan, Sep. 17-20, 2007.

  170. Hachiya, H. & Sugiyama, M.
    Robot control by least-squares policy iteration with geodesic Gaussian kernels.
    In Proceedings of The 21st Annual Conference of The Japanese Society for Artificial Intelligence (JSAI2007), no.3D9-2, Miyazaki, Japan, Jun. 18-22, 2007.

  171. Kitamura, Y. & Sugiyama, M.
    Dimensionality reduction of partially labeled multimodal data.
    In Proceedings of The 21st Annual Conference of The Japanese Society for Artificial Intelligence (JSAI2007), no.3D6-1, Miyazaki, Japan, Jun. 18-22, 2007.

  172. Sugiyama, M.
    Supervised learning under covariate shift.
    13th Symposium on Sensing via Image Information, Yokohama, Japan, Jun. 11-13, 2007.

  173. Sugiyama, M., Kawanabe, M., Blanchard, G., Spokoiny, V., & Müller, K.-R.
    Approximating the best linear unbiased estimator of non-Gaussian signals with Gaussian noise.
    Technical Report TR07-0001, Department of Computer Science, Tokyo Institute of Technology, Tokyo, Japan, 2007.

  174. Sugiyama, M., Matsuzaka, T., Ogawa, H., Kitagawa, K., & Suzuki, K.
    One-shot profiling of sharp bumpy surfaces.
    In Proceedings of the Japan Society for Precision Engineering 2007 Spring Meeting, no.G07, pp.586-587, Tokyo, Japan, Mar. 20-22, 2007.

  175. Sugiyama, M.
    Local Fisher discriminant analysis for dimensionality reduction.
    In Proceedings of the Japanese Society for Artificial Intelligence, 3rd Meeting of Special Interest Group on Data Mining and Statistical Mathematics, SIG-DMSM-A603-04, pp.19-26, Kobe, Japan, Feb. 27-28, 2007.
    (This paper was selected for JSAI Incentive Award)

  176. Sugiyama, M.
    Active learning, model selection, and covariate shift.
    NIPS2006 Workshop on Learning when test and training inputs have different distributions, Whistler, British Columbia, Canada, Dec. 9, 2006.

  177. Gokita, S., Sugiyama, M., & Sakurai, K.
    Adaptive ridge learning in kernel eigenspace and its model selection.
    IEICE Technical Report, NC2006-97, pp.55-60, Hokkaido, Japan, Jan. 25-26, 2007.

  178. Hidaka, Y. & Sugiyama, M.
    A new meta-criterion for regularized subspace information criterion.
    IEICE Technical Report, NC2006-96, pp.49-54, Hokkaido, Japan, Jan. 25-26, 2007.

  179. Sugiyama, M., Hachiya, H., Towell, C., & Vijayakumar, S.
    Geodesic Gaussian kernels for value function approximation.
    In Proceedings of 2006 Workshop on Information-Based Induction Sciences (IBIS2006), pp.316-321, Osaka, Japan, Oct. 31-Nov. 2, 2006.

  180. Rubens, N. & Sugiyama, M.
    Coping with active learning with model selection dilemma: Minimizing expected generalization error.
    In Proceedings of 2006 Workshop on Information-Based Induction Sciences (IBIS2006), pp.310-315, Osaka, Japan, Oct. 31-Nov. 2, 2006.

  181. Sugiyama, M., Krauledat, M., & Müller, K.-R.
    A method of covariate shift adaptation with application to brain-computer interfacing.
    In Proceedings of 2006 Workshop on Information-Based Induction Sciences (IBIS2006), pp.71-76, Osaka, Japan, Oct. 31-Nov. 2, 2006.

  182. Sugiyama, M.
    Local Fisher discriminant analysis.
    In Proceedings of Subspace2006, pp.85-100, Sendai, Japan, Sep. 18, 2006.

  183. Sugiyama, M., Blankertz, B., Krauledat, M., Dornhege, G., & Müller, K.-R.
    Compensating non-stationarity in brain computer interfaces through covariate shift adaptation.
    Presented at 2006 Japan-Germany Symposium on Computational Neuroscience, Saitama, Japan, Feb. 1-4, 2006.

  184. Shinada, Y. & Sugiyama, M.
    Embedding of labeled multimodal data.
    IEICE Technical Report, NC2005-102, pp.25-30, Sapporo, Japan, Jan. 23-24, 2006.

  185. Kawanabe, M., Blanchard, G., Sugiyama, M., Spokoiny, V., & Müller, K.-R.
    In search of non-Gaussian components of a high-dimensional distribution.
    In Proceedings of 2nd International Symposium on Information Geometry and its Applications (IGAIA2005), pp.109-116, Tokyo, Japan, Dec. 12-16, 2005.

  186. Sugiyama, M.
    An active learning algorithm for approximately correct models.
    In Proceedings of 2005 Workshop on Information-Based Induction Sciences (IBIS2005), pp.57-62, Tokyo, Japan, Nov. 9-11, 2005.

  187. Sugiyama, M. & Müller, K.-R.
    Generalization error estimation under covariate shift.
    In Proceedings of 2005 Workshop on Information-Based Induction Sciences (IBIS2005), pp.21-26, Tokyo, Japan, Nov. 9-11, 2005.

  188. Müller, K.-R., Sugiyama, M., Shenoy, P., & Krauledat, M.
    Input-dependent estimation of generalization error under covariate shift.
    Presented at PASCAL Workshop on Modelling in Classification and Statistical Learning, Eindhoven, The Netherlands, Oct. 3-5, 2005.

  189. Hanhijärvi, S. & Sugiyama, M.
    A method of active learning with model selection.
    IEICE Technical Report, NC2005-36, pp.37-42, Tokyo, Japan, Jul. 27, 2005.

  190. Sakurai, K. & Sugiyama, M.
    Analytic model optimization using a regularized generalization error estimator.
    In Proceedings of Meeting on Image Recognition and Understanding 2005 (MIRU2005), pp.1013-1020, Hyogo, Japan, Jul. 18-20, 2005.

  191. Blanchard, G., Kawanabe, M., Sugiyama, M., Spokoiny, V., & Müller, K.-R.
    Finding interesting parts of multidimensional data via identification of non-Gaussian linear subspaces.
    Presented at The Learning Workshop, Snowbird, Utah, USA, Apr. 5-8, 2005.

  192. Sugiyama, M. & Müller, K.-R.
    Generalization error estimation when training and test input points follow different probability distributions.
    IEICE Technical Report, NC2004-215, pp.129-134, Tokyo, Japan, Mar. 28-30, 2005.

  193. Sugiyama, M., Kawanabe, M., Blanchard, G., Spokoiny, V., & Müller, K.-R.
    A semiparametric approach to identifying non-Gaussian components in high dimensional data.
    In Proceedings of International Symposium on the Art of Statistical Metaware (Metaware2005), pp.296-297, Tokyo, Japan, Mar. 14-16, 2005.

  194. Kawanabe, M., Spokoiny, V., Blanchard, G., Sugiyama, M., & Müller, K.-R.
    In search of non-Gaussian components of a high-dimensional distribution.
    Presented at Subspace, Latent Structure and Feature Selection techniques: Statistical and Optimisation perspectives Workshop, PASCAL Network, Bohinj, Slovenia, Feb. 23-25, 2005.

  195. Sugiyama, M., Kambe, K., & Ogawa, H.
    Restoration of printed images based on degradation models.
    Technical Report TR04-0003, Department of Computer Science, Tokyo Institute of Technology, Tokyo, Japan, 2004.

  196. Kawanabe, M., Spokoiny, V., Blanchard, G., Sugiyama, M., & Müller, K.-R.
    Finding interesting parts of multidimensional data: How to determine non-Gaussian linear subspaces.
    In J. Fan, K.-R. Müller, and V. Spokoiny (Eds.), New Inference Concepts for Analysing Complex Data, vol.447, Mathematisches Forschungsinstitut Oberwolfach, Oberwolfach, Germany, Nov. 14-20, 2004.

  197. Ogawa, H., Nakanowatari, A., Kitagawa, K., & Sugiyama, M.
    3-D profiling of film-covered objects using phase-shifting interferometry.
    In Proceedings of the 2004 Autumn Meeting of the Japan Society for Precision Engineering, pp.1125-1126, Shimane, Japan, Sep. 15-17, 2004. (in Japanese)

  198. Sugiyama, M. & Nishihara, A.
    DSP Education at Department of Computer Science, Tokyo Institute of Technology.
    In Proceedings of 5th DSPS Educators Conference, pp.3-6, Tokyo, Japan, Sep. 17-18, 2003.

  199. Kambe, K., Sugiyama, M., & Ogawa, H.
    Restoration of degraded print images.
    In Proceedings of the 2003 IEICE General Conference D-11-97, p.97, Sendai, Japan, Mar. 19-22, 2003.

  200. Okabe, Y., Sugiyama, M., & Ogawa, H.
    Generalization error estimation in the presence of training input noise.
    In Proceedings of the 2003 IEICE General Conference D-2-6, p.12, Sendai, Japan, Mar. 19-22, 2003.

  201. Sugiyama, M., Kawanabe, M., & Müller, K.-R.
    Regularization approach to improving an unbiased generalization error estimator.
    IEICE Technical Report, NC2002-195, pp.131-136, Tokyo, Japan, Mar. 17-19, 2003.

  202. Sugiyama, M., Fujino, M., & Müller, K.-R.
    A new kernel for binary regression.
    IEICE Technical Report, NC2002-150, pp.101-106, Tokyo, Japan, Mar. 17-19, 2003.

  203. Sugiyama, M. & Ogawa, H.
    On variance of subspace information criterion.
    In Proceedings of 2002 Annual Conference of Japanese Neural Network Society (JNNS2002), pp.105-108, Tottori, Japan, Sep. 19-21, 2002.

  204. Sugiyama, M.
    Unbiased estimation of generalization error for kernel regression.
    NATO Advanced Science Institute on Learning Theory and Practice (LTP2002), Leuven, Belgium, Jul. 8-19, 2002.

  205. Tanaka, S., Sugiyama, M., & Ogawa, H.
    Theoretical evaluation of corrected subspace information criterion for model selection.
    In Proceedings of the 2002 IEICE General Conference D-2-2, p.11, Tokyo, Japan, Mar. 27-30, 2002.

  206. Sugiyama, M. & Müller, K.-R.
    Ridge parameter determination in infinite dimensional hypothesis spaces.
    IEICE Technical Report, NC2001-135, pp.21-28, Tokyo, Japan, Mar. 18-20, 2002.

  207. Sugiyama, M.
    From learning the whole rule to estimating a value at a point of interest.
    The Brain & Neural Networks, vol.9, no.1, pp.77-78, 2002.

  208. Sugiyama, M. & Ogawa, H.
    Subspace information criterion---Determining parameters in linear filters for optimal restoration.
    In Proceedings of the 16th Digital Signal Processing Symposium, pp.47-52, Okinawa, Japan, Nov. 7-9, 2001.

  209. Sugiyama, M. & Ogawa, H.
    Optimal design of ridge parameter.
    In Proceedings of 2001 Annual Conference of Japanese Neural Network Society (JNNS2001 Nara), pp.9-10, Nara, Japan, Sep. 27-29, 2001.

  210. Sugiyama, M. & Ogawa, H.
    Optimal design of regularization parameter in linear regression.
    Presented at Highdimensional Nonlinear Statistical Modeling, Wulkow, Germany, Sep. 15-19, 2001.

  211. Tsuda, K., Sugiyama, M., & Müller, K.-R.
    Subspace information criterion for sparse regressors.
    In Proceedings of 2001 Workshop on Information-Based Induction Sciences (IBIS2001), pp.183-188, Tokyo, Japan, Jul. 30-Aug. 1, 2001.

  212. Sugiyama, M., Imaizumi, D., & Ogawa, H.
    Image restoration with subspace information criterion---Optimizing parameters of linear filters.
    In Proceedings of 2001 Workshop on Information-Based Induction Sciences (IBIS2001), pp.77-82, Tokyo, Japan, Jul. 30-Aug. 1, 2001.

  213. Moro, S. & Sugiyama, M.
    Estimation of precipitation from meteorological radar data.
    In Proceedings of the 2001 IEICE General Conference SD-1-10, pp.264-265, Shiga, Japan, Mar. 26-29, 2001.
    (This paper won the 1st prize at the 2001 IEICE Precipitation Estimation Contest)

  214. Imaizumi, D., Sugiyama, M., & Ogawa, H.
    Parameter optimization for image restoration filters by subspace information criterion.
    IEICE Technical Report, PRMU2000-243, pp.153-160, Fukuoka, Japan, Mar. 15-16, 2001.

  215. Sugiyama, M. & Ogawa, H.
    Subspace information criterion---Unbiased generalization error estimator for linear regression.
    Presented at NIPS2000 Workshop on Cross-Validation, Bootstrap and Model Selection, Breckenridge, Colorado, USA, Nov. 30-Dec. 2, 2000.

  216. Sugiyama, M. & Ogawa, H.
    Optimal estimation of values of functions at points of interest by model selection.
    In Proceedings of the Joint Meeting of 23rd Annual Meeting of Japan Neuroscience Society and 10th Annual Meeting of Japanese Neural Network Society, p.197, Yokohama, Japan, Sep. 4-6, 2000.
    (This paper was selected for 2001 JNNS Encouragement Award)

  217. Sugiyama, M. & Ogawa, H.
    Active learning with model selection for optimal generalization.
    In Proceedings of 2000 Workshop on Information-Based Induction Sciences (IBIS2000), pp.87-92, Shizuoka, Japan, Jul. 17-18, 2000.

  218. Sugiyama, M. & Ogawa, H.
    Simultaneous optimization of sample points and models.
    IEICE Technical Report, NC2000-26, pp.17-24, Okinawa, Japan, Jun. 22-23, 2000.

  219. Yamaguchi, K., Sugiyama, M., & Ogawa, H.
    Projection learning based handwritten numeral recognition.
    In Proceedings of the 2000 IEICE General Conference, D-12-10, p.180, Hiroshima, Japan, Mar. 28-31, 2000.

  220. Sugiyama, M. & Ogawa, H.
    Incremental active learning for optimal data selection.
    In Proceedings of the 2000 IEICE General Conference, D-2-2, p.11, Hiroshima, Japan, Mar. 28-31, 2000.

  221. Sugiyama, M. & Ogawa, H.
    Bias estimation and model selection.
    IEICE Technical Report, NC99-81, pp.9-16, Sapporo, Japan, Feb. 3-4, 2000.

  222. Sugiyama, M. & Ogawa, H.
    Active learning for optimal generalization.
    In Proceedings of the 10th Tokyo Institute of Technology Brain Research Symposium, pp.20-27, Tokyo, Japan, Dec. 10, 1999.

  223. Sugiyama, M. & Ogawa, H.
    Incremental active learning in consideration of bias.
    IEICE Technical Report, NC99-56, pp.15-22, Fukuoka, Japan, Nov. 26, 1999.

  224. Nishi, E., Sugiyama, M., & Ogawa, H.
    Incremental learning for optimal generalization in a family of projection learnings.
    IEICE Technical Report, NC99-55, pp.7-14, Fukuoka, Japan, Nov. 26, 1999.

  225. Sugiyama, M. & Ogawa, H.
    On the selection of subspace models.
    In Proceedings of 1999 Annual Conference of Japanese Neural Network Society (JNNS1999), pp.175-176, Sapporo, Japan, Sep. 20-22, 1999.

  226. Sugiyama, M. & Ogawa, H.
    Functional analytic approach to model selection---Subspace information criterion.
    In Proceedings of 1999 Workshop on Information-Based Induction Sciences (IBIS'99), pp.93-98, Shizuoka, Japan, Aug. 26-27, 1999.

  227. Sugiyama, M. & Ogawa, H.
    Active learning in trigonometric polynomial neural networks.
    In Proceedings of the 1999 IEICE General Conference, D-2-26, p.33, Kanagawa, Japan, Mar. 25-28, 1999.
    (This paper was selected for 1999 IEICE Academic Encouragement Award)

  228. Nakashima, A., Sugiyama, M., & Ogawa, H.
    Projection learning as an extension of best linear unbiased estimation.
    In Proceedings of the 1999 IEICE General Conference, D-2-24, p.31, Yokohama, Japan, Mar. 25-28, 1999.

  229. Sugiyama, M. & Ogawa, H.
    Exact incremental projection learning in neural networks.
    IEICE Technical Report, NC98-97, pp.149-156, Sapporo, Japan, Feb. 5, 1999.

  230. Sugiyama, M. & Ogawa, H.
    Training data selection for optimal generalization in a trigonometric polynomial model.
    IEICE Technical Report, NC98-50, pp.55-62, Fukuoka, Japan, Oct. 24, 1998.

  231. Sugiyama, M. & Ogawa, H.
    Active learning for noise suppression.
    IEICE Technical Report, NC98-21, pp.87-94, Okinawa, Japan, Jun. 18-19, 1998.

  232. Sugiyama, M. & Ogawa, H.
    Incremental projection learning in the presence of noise.
    In Proceedings of the 1998 IEICE General Conference, D-2-17, p.23, Yokohama, Japan, Mar. 27-30, 1998.

  233. Sugiyama, M. & Ogawa, H.
    Incremental projection learning for optimal generalization.
    IEICE Technical Report, NC97-145, pp.47-54, Tokyo, Japan, Mar. 19-20, 1998.


Patents

  1. Ueki, K., Ihara, Y., & Sugiyama, M.
    Attribute value estimation device, attribute value estimation method, program, and recording medium.
    Unexamined publication: USA US-2013-0254143-A1 (Sep. 26, 2013)
    Unexamined publication: Europe EP2-650-842-A1 (Oct. 16, 2013)

  2. Kimura, A., Kameoka, H., Sakano, H., & Sugiyama, M.
    Image additional information relation learning apparatus, method, and program.
    Unexamined publication: Japan 2013-105393 (May 30, 2013)

  3. Kimura, A., Kameoka, H., Maeda, E., Sakano, H., Ishiguro, K., & Sugiyama, M.
    Semi-supervised signal annotation retrieval apparatus, semi-supervised signal annotation retrieval method and program.
    Publication: Japan 5499362 (Mar. 20, 2014)

  4. Kimura, A., Kameoka, H., Maeda, E., Sakano, H., Ishiguro, K., & Sugiyama, M.
    Semi-supervised topic model learning apparatus, semi-supervised topic model learning method and program.
    Publication: Japan 5499361 (Mar. 20, 2014)

  5. Hirata, T., Kawahara, Y., & Sugiyama, M.
    Automatic pattern extraction method and automatic pattern extraction system.
    Unexamined publication: Japan 2011-247696 (Dec. 8, 2011)

  6. Ueki, K. & Sugiyama, M.
    Target variable calculation system, target variable calculation method, program and recording media.
    Unexamined publication: Japan 2011-141740 (Jul. 21, 2011)

  7. Ueki, K., Ihara, Y., & Sugiyama, M.
    Target variable calculation system, target variable calculation method, program and recording media.
    Unexamined publication: Japan 2011-070471 (Apr. 7, 2011)

  8. Sugiyama, M., Yokota, T., Ogawa, H., Kitagawa, K., & Suzuki, K.
    Method and equipment for surface profiling.
    Unexamined publication: Japan 2010-185844 (Aug. 26, 2010)

  9. Sugiyama, M., Naito, T., Ogawa, H., Kitagawa, K., & Suzuki, K.
    Method and equipment for surface and/or film thickness profiling.
    Unexamined publication: Japan 2010-060420 (Mar. 18, 2010)

  10. Sugiyama, M. & Nakajima, S.
    Position detection method, program, position detection system and exposure system.
    Unexamined publication: Japan 2010-040553 (Feb. 18, 2010)

  11. Kitagawa, K., Sugiyama, M., Ogawa, H., & Suzuki, K.
    A measurement method of surface shape with plural wavelentghs and an apparatus using the same method.
    Publication: China ZL200810005781.8 (Feb. 16, 2011)
    Publication: Japan 4885154 (Dec. 16, 2011)
    Unexamined publication: Korea 2008-71905 (Aug. 5, 2008)
    Unexamined publication: Taiwan 200839177 (Oct. 1, 2008)

  12. Ueki, K., Sugiyama, M., & Ihara, Y.
    Age estimation system, age estimation method, and program.
    Publication: Japan 4742193 (May 20, 2011)
    Publication: Korea 10-1299775 (Aug. 19, 2013)

  13. Ueki, K., Sugiyama, M., & Ihara, Y.
    Age estimation system, method, and program.
    Publication: Japan 4742192 (May 20, 2011)
    Publication: Korea 10-1342717 (Dec. 11, 2013)

  14. Sugiyama, M., Ogawa, H., Kitagawa, K., & Suzuki, K.
    Surface shape measuring method and device using the same.
    Publication: Japan 4710078 (Apr. 1, 2011)
    Unexamined publication: PCT WO 2007/088789 A1 (Aug. 9, 2007)
    Publication: Taiwan I405949 (Aug. 21, 2013)
    Publication: Korea 10-1257538 (Apr. 17, 2013)
    Publication: USA US-7852489-B2 (Dec. 14, 2010)

  15. Ogawa, H., Sugiyama, M., Shimoyama, K., & Kitagawa, K.
    Method and equipment for surface and/or film thickness profiling.
    Publication: Japan 4192038 (Sep. 26, 2008)

  16. Ogawa, H., Nakanowatari, H., Hayashi, M., Kitagawa, K., & Sugiyama, M.
    Method and equipment for surface and/or film thickness profiling of film-covered objects.
    Publication: Japan 4183089 (Sep. 12, 2008)

  17. Ogawa, H., Kitagawa, K., Sugiyama, M., & Shimoyama, K.
    Method and equipment for surface and/or film thickness profiling.
    Unexamined publication: Japan 2004-340680 (Dec. 2, 2004)


Theses

  1. Sugiyama, M.
    A theory of model selection and active learning for supervised learning.
    Doctoral Thesis, Department of Computer Science, Tokyo Institute of Technology, Tokyo, Japan, Jan. 2001.
    (This thesis was selected for 2002 Tejima Doctor Dissertation Award)
    [ thesis, slides ]

  2. Sugiyama, M.
    Incremental active learning for optimal generalization in neural networks.
    Master's Thesis, Department of Computer Science, Tokyo Institute of Technology, Tokyo, Japan, Feb. 1999.
    [ thesis ]


Masashi Sugiyama (sugi [at] cs.titech.ac.jp)

Sugiyama Laboratory, Department of Computer Science, Graduate School of Information Science and Engineering, Tokyo Institute of Technology,
2-12-1-W8-74, O-okayama, Meguro-ku, Tokyo, 152-8552, Japan.
TEL & FAX: +81-3-5734-2699