About
School of Computing, Tokyo Institute of Technology
Research
Deep Learning
Second-order Optimization
Large-scale Distributed Parallelization
Publications
Reviewed
Satoki Ishikawa, Ryo Karakida. On the Parameterization of Second-Order Optimization Effective Towards the Infinite Width. ICLR 2024.
Satoki Ishikawa, Rio Yokota. When Does Second-Order Optimization Speed Up Training? ICLR 2024 Tiny Papers Track.
Kazuki Osawa, Satoki Ishikawa, Rio Yokota, Shigang Li, Torsten Hoefler. ASDL: A Unified Interface for Gradient Preconditioning in PyTorch. HOOML 2022.
Preprints
Satoki Ishikawa, Rio Yokota. Local Loss Optimization in the Infinite Width: Stable Parameterization of Predictive Coding Networks and Target Propagation.
Satoki Ishikawa, Makoto Yamada, Han Bao, Yuki Takezawa. PhiNets: Brain-inspired Non-contrastive Learning Based on Temporal Prediction Hypothesis.
Domestic Conferences
Satoki Ishikawa, Rio Yokota. Study on Gradient Preconditioning Methods in Deep Learning. IPSJ 85th Annual Conference, 2023. [Student Encouragement Award]
Education
2023.04 - Present
Master's Program, School of Computing, Tokyo Institute of Technology
2019.04 - 2023.03
Bachelor's Program, School of Computing, Tokyo Institute of Technology
Graduation Representative
Awards
Student Encouragement Award, IPSJ 85th Annual Conference
Excellent Student Award, School of Computing, Tokyo Institute of Technology