Search Results - "Sakai, Yasufumi"

  • Showing 1 - 17 results of 17
  2. Bioaccumulation and Toxic Potencies of Polychlorinated Biphenyls and Polycyclic Aromatic Hydrocarbons in Tidal Flat and Coastal Ecosystems of the Ariake Sea, Japan by Nakata, Haruhiko, Sakai, Yasufumi, Miyawaki, Takashi, Takemura, Akira

    Published in Environmental science & technology (15-08-2003)
    “…Sediment and marine biota comprising several species of tidal flat and coastal organisms were analyzed for polychlorinated biphenyls (PCBs) including non- and…”
    Journal Article
  3. S-DFP: shifted dynamic fixed point for quantized deep neural network training by Sakai, Yasufumi, Tamiya, Yutaka

    Published in Neural computing & applications (17-12-2021)
    “…Recent advances in deep neural networks have achieved higher accuracy with more complex models. Nevertheless, they require much longer training time. To reduce…”
    Journal Article
  5. Automatic Pruning Rate Derivation for Structured Pruning of Deep Neural Networks by Sakai, Yasufumi, Iwakawa, Akinori, Tabaru, Tsuguchika, Inoue, Atsuki, Kawaguchi, Hiroshi

    “…To compress the neural network model, structured pruning has been proposed. However, finding a proper pruning rate to suppress the accuracy degradation of…”
    Conference Proceeding
  7. Dropout and DropConnect for Reliable Neuromorphic Inference Under Communication Constraints in Network Connectivity by Sakai, Yasufumi, Pedroni, Bruno U., Joshi, Siddharth, Tanabe, Satoshi, Akinin, Abraham, Cauwenberghs, Gert

    “…Dropout and DropConnect are known as effective methods to improve on the generalization performance of neural networks, by either dropping states of neural…”
    Journal Article
  8. Quantization for Deep Neural Network Training with 8-bit Dynamic Fixed Point by Sakai, Yasufumi

    “…Recent advances in deep neural networks have achieved higher accuracy with more complex models. Nevertheless, they require much longer training time. To reduce…”
    Conference Proceeding
  9. Envelope tracking CMOS power amplifier with high-speed CMOS envelope amplifier for mobile handsets by Yoshida, Eiji, Sakai, Yasufumi, Oishi, Kazuaki, Yamazaki, Hiroshi, Mori, Toshihiko, Yamaura, Shinji, Suto, Kazuo, Tanaka, Tetsu

    Published in Japanese Journal of Applied Physics (01-04-2014)
    “…A high-efficiency CMOS power amplifier (PA) based on envelope tracking (ET) has been reported for a wideband code division multiple access (W-CDMA) and long…”
    Journal Article
  10. Active Learning for Graph Neural Networks Training in Catalyst Energy Prediction by Sakai, Yasufumi, Matsumura, Naoki, Inoue, Atsuki, Kawaguchi, Hiroshi, Thang, Dang, Ishikawa, Atsushi, Hoskuldsson, Arni Bjorn, Skulason, Egill

    “…The electrification of major industrial processes constitutes an important step in reducing global carbon emissions. Thus, the identification of materials able…”
    Conference Proceeding
  12. GPQ: Greedy Partial Quantization of Convolutional Neural Networks Inspired by Submodular Optimization by Tsuji, Satoki, Yamada, Fuyuka, Kawaguchi, Hiroshi, Inoue, Atsuki, Sakai, Yasufumi

    “…Recent work has revealed that the effects of neural network quantization on inference accuracy are different for each layer. Therefore, partial quantization…”
    Conference Proceeding
  13. Efficient and Large Scale Pre-training Techniques for Japanese Natural Language Processing by Kasagi, Akihiko, Asaoka, Masahiro, Tabuchi, Akihiro, Oyama, Yosuke, Honda, Takumi, Sakai, Yasufumi, Dang, Thang, Tabaru, Tsuguchika

    “…Pre-training in natural language processing greatly affects the accuracy of downstream tasks. However, pre-training is a bottleneck in the AI system…”
    Conference Proceeding
  15. DropOut and DropConnect for Reliable Neuromorphic Inference under Energy and Bandwidth Constraints in Network Connectivity by Sakai, Yasufumi, Pedroni, Bruno U., Joshi, Siddharth, Akinin, Abraham, Cauwenberghs, Gert

    “…DropOut and DropConnect are known as effective methods to improve on the generalization performance of neural networks, by either dropping states of neural…”
    Conference Proceeding