Search Results - "Hagiwara, Katsuyuki"

  1.

    Prediction-accuracy improvement of neural network to ferromagnetic multilayers by Gaussian data augmentation and ensemble learning by Nawa, Kenji, Hagiwara, Katsuyuki, Nakamura, Kohji

    Published in Computational materials science (25-02-2023)
    “…In materials informatics using machine learning and density functional theory (DFT) calculations, it is often hard to obtain enough database due to extremely…”
    Journal Article
  3.

    Relation between weight size and degree of over-fitting in neural network regression by Hagiwara, Katsuyuki, Fukumizu, Kenji

    Published in Neural networks (2008)
    “…This paper investigates the relation between over-fitting and weight size in neural network regression. The over-fitting of a network to Gaussian noise is…”
    Journal Article
  4.

    Investigation of an efficient method of oocyte retrieval by dual stimulation for patients with cancer by Takeuchi, Hiroki, Maezawa, Tadashi, Hagiwara, Katsuyuki, Horage, Yuki, Hanada, Tetsuro, Haipeng, Huang, Sakamoto, Mito, Nishioka, Mikiko, Takayama, Erina, Terada, Kento, Kondo, Eiji, Takai, Yasushi, Suzuki, Nao, Ikeda, Tomoaki

    Published in Reproductive medicine and biology (01-01-2023)
    “…Purpose To examine the optimal timing of second ovarian stimulation using the dual stimulation method for good ovarian responders with cancer undergoing oocyte…”
    Journal Article
  5.

    On scaling of soft-thresholding estimator by Hagiwara, Katsuyuki

    Published in Neurocomputing (Amsterdam) (19-06-2016)
    “…LASSO is known to have a problem of excessive shrinkage at a sparse representation. To analyze this problem in detail, in this paper, we consider a positive…”
    Journal Article
  6.

    Upper bound of the expected training error of neural network regression for a Gaussian noise sequence by Hagiwara, Katsuyuki, Hayasaka, Taichi, Toda, Naohiro, Usui, Shiro, Kuno, Kazuhiro

    Published in Neural networks (01-12-2001)
    “…In neural network regression problems, often referred to as additive noise models, NIC (Network Information Criterion) has been proposed as a general model…”
    Journal Article
  7.

    A semi-supervised learning using over-parameterized regression by Hagiwara, Katsuyuki

    Published 05-09-2024
    “…Semi-supervised learning (SSL) is an important theme in machine learning, in which we have a few labeled samples and many unlabeled samples. In this paper, for…”
    Journal Article
  8.

    On gradient descent training under data augmentation with on-line noisy copies by Hagiwara, Katsuyuki

    Published 08-06-2022
    “…In machine learning, data augmentation (DA) is a technique for improving the generalization performance. In this paper, we mainly considered gradient descent…”
    Journal Article
  9.

    Bridging between soft and hard thresholding by scaling by Hagiwara, Katsuyuki

    Published 02-02-2022
    “…In this article, we developed and analyzed a thresholding method in which soft thresholding estimators are independently expanded by empirical scaling values…”
    Journal Article
  10.

    Radical Density Measurement at Low-Pressure Discharge Denitrification by Appearance Mass Spectrometry by Ito, Kohei, Hagiwara, Katsuyuki, Nakaura, Hiroyuki, Tanaka, Hidekazu, Onda, Kazuo

    Published in Japanese Journal of Applied Physics (01-03-2001)
    “…In discharge denitrification, radical production by electron collision with combustion gas is a key process which determines the denitrification process and…”
    Journal Article
  12.

    On an improvement of LASSO by scaling by Hagiwara, Katsuyuki

    Published 22-08-2018
    “…A sparse modeling is a major topic in machine learning and statistics. LASSO (Least Absolute Shrinkage and Selection Operator) is a popular sparse modeling…”
    Journal Article
  13.

    A consistent model selection for orthogonal regression under component-wise shrinkage by Hagiwara, Katsuyuki

    “…Several authors developed a series of model selection criteria for determining the major frequency components in harmonic analysis. In this paper, we…”
    Journal Article
  14.

    Adaptive scaling for soft-thresholding estimator by Hagiwara, Katsuyuki

    Published 29-01-2016
    “…Soft-thresholding is a sparse modeling method that is typically applied to wavelet denoising in statistical signal processing and analysis. It has a single…”
    Journal Article
  15.

    Regularization learning, early stopping and biased estimator by Hagiwara, Katsuyuki

    Published in Neurocomputing (Amsterdam) (2002)
    “…In this article, we present a unified statistical interpretation of regularization learning and early stopping for linear networks in the context of…”
    Journal Article
  16.

    On a training scheme based on orthogonalization and thresholding for a nonparametric regression problem by Hagiwara, Katsuyuki

    “…For a nonparametric regression problem, we have been proposed a training scheme based on orthogonalization and thresholding, in which a machine is assumed to…”
    Conference Proceeding
  19.

    On the problem of applying AIC to determine the structure of a layered feed-forward neural network by Hagiwara, Katsuyuki, Toda, Naohiro, Usui, Shiro

    “…AIC (Akaike's Information Criterion) has been thought to be effective to determine an optimal structure of layered feed-forward neural networks. However, it…”
    Journal Article
  20.

    Regularization learning and early stopping in linear networks by Hagiwara, K., Kuno, K.

    “…Generally, learning is performed so as to minimize the sum of squared errors between network outputs and training data. Unfortunately, this procedure does not…”
    Conference Proceeding