Search Results - "Kim, Jounghee"

  • Showing 1 - 5 results of 5
  1.

    Explainable anomaly detection framework for predictive maintenance in manufacturing systems by Choi, Heejeong, Kim, Donghwa, Kim, Jounghee, Kim, Jina, Kang, Pilsung

    Published in Applied soft computing (01-08-2022)
    “…To conduct preemptive essential maintenance, predictive maintenance detects the risk of unexpected shutdowns in a manufacturing system, thereby ensuring…”
    Journal Article
  2.

    Cu microvia filling by pulse-reverse electrodeposition with a single accelerator by Seo, Huiju, Kim, Jounghee, Kang, Jungkyu, Park, Jong-Eun, Kim, Myung Jun, Kim, Jae Jeong

    Published in Electrochimica acta (20-06-2024)
    “…•Single-accelerator microvia filling was achieved by pulse-reverse electrodeposition.•The addition of Cl− was critical for single-accelerator microvia…”
    Journal Article
  3.

    K-Wav2vec 2.0: Automatic Speech Recognition based on Joint Decoding of Graphemes and Syllables by Kim, Jounghee, Kang, Pilsung

    Published 11-10-2021
    “…Wav2vec 2.0 is an end-to-end framework of self-supervised learning for speech representation that is successful in automatic speech recognition (ASR), but most…”
    Journal Article
  4.

    Drug utilization review of mupirocin ointment in a Korean university-affiliated hospital by Youn, Sung Hee, Lee, Seung Soon, Kim, Sukyeon, Lee, Jeong-A, Kim, Bum Joon, Kim, Jounghee, Han, Hye-Kyung, Kim, Jae-Seok

    Published in The Korean journal of internal medicine (01-07-2015)
    “…Intranasal mupirocin and chlorhexidine bathing are candidate strategies to prevent healthcare-associated infections caused by methicillin-resistant…”
    Journal Article
  5.

    Back-Translated Task Adaptive Pretraining: Improving Accuracy and Robustness on Text Classification by Lee, Junghoon, Kim, Jounghee, Kang, Pilsung

    Published 22-07-2021
    “…Language models (LMs) pretrained on a large text corpus and fine-tuned on a downstream task have become a de facto…”
    Journal Article