
OMRON SINIC X to Present Latest Research Findings at ICLR 2025, Top-tier Conference in the Field of Machine Learning

OMRON SINIC X Corporation (HQ: Bunkyo-ku, Tokyo; President and CEO: Masaki Suwa; hereinafter “OSX”) will present its latest research findings at the Thirteenth International Conference on Learning Representations (hereinafter “ICLR 2025”).

ICLR 2025 is one of the top-tier international conferences in the field of machine learning, particularly known for showcasing cutting-edge research in deep learning and representation learning. The conference will be held from April 24 to April 28, 2025, in Singapore (local time).

The two research papers to be presented by OSX are as follows.

ICLR 2025 presentations

 Rethinking the role of frames for SE(3)-invariant crystal structure modeling

Yusei Ito (OSX Intern/Osaka University), Tatsunori Taniai (OSX), Ryo Igarashi (OSX), Yoshitaka Ushiku (OSX), Kanta Ono (Osaka University)
 
Our daily lives benefit from a wide range of material devices, such as magnets and semiconductors. In recent years, the development of next-generation materials—such as high-temperature superconductors and high-performance battery materials—has been actively pursued. While materials development traditionally involves extensive trial and error over long periods, recent advances in AI technology offer promising ways to accelerate this process.
In this study, we developed CrystalFramer, a transformer-based neural network that predicts material properties from crystal structures, the “blueprints” of materials. By extending an existing technique known as frames, we propose a new concept called dynamic frames and show that it enhances the expressive power of our previously introduced model, Crystalformer, enabling it to capture the three-dimensional geometry of crystal structures in finer detail.

https://openreview.net/forum?id=gzxDjnvBDa
https://omron-sinicx.github.io/crystalframer/
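
To give a sense of what a “frame” means here, the following sketch (plain NumPy written for this article, not code from CrystalFramer; the weighting scheme and function names are illustrative assumptions) builds a local coordinate frame for one atom from its neighbor directions and re-expresses those directions in that frame. Because the frame rotates together with the structure, the resulting coordinates are unchanged by global rotations, which is the SE(3) invariance in the title. A dynamic frame, as proposed in the paper, would derive the weights from learned attention at every layer rather than fixing them.

    # Illustrative sketch only (not the CrystalFramer implementation).
    # Shows how projecting neighbor vectors onto a locally constructed frame
    # yields coordinates that are invariant to global rotations.
    import numpy as np

    def local_frame(rel_vectors, weights):
        """Build a right-handed orthonormal frame (3x3) from weighted neighbor directions.

        rel_vectors: (N, 3) vectors from a center atom to its neighbors.
        weights:     (N,) non-negative importance weights. A static frame would use
                     fixed geometric weights (e.g., inverse distance); a dynamic frame
                     would derive them from learned attention at each layer.
        """
        # First axis: weighted mean neighbor direction.
        e1 = (weights[:, None] * rel_vectors).sum(axis=0)
        e1 /= np.linalg.norm(e1) + 1e-12
        # Second axis: the most heavily weighted neighbor direction, orthogonalized
        # against e1 (Gram-Schmidt). Degenerate collinear cases are ignored here.
        v = rel_vectors[np.argmax(weights)]
        e2 = v - (v @ e1) * e1
        e2 /= np.linalg.norm(e2) + 1e-12
        # Third axis completes the right-handed basis.
        e3 = np.cross(e1, e2)
        return np.stack([e1, e2, e3])

    def invariant_coords(rel_vectors, weights):
        """Express neighbor vectors in the local frame; invariant to global rotation."""
        return rel_vectors @ local_frame(rel_vectors, weights).T

    # Quick check: rotating the whole neighborhood leaves the coordinates unchanged.
    rng = np.random.default_rng(0)
    rel = rng.normal(size=(8, 3))
    w = rng.uniform(0.1, 1.0, size=8)
    Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
    Q *= np.sign(np.linalg.det(Q))  # ensure a proper rotation (det = +1)
    assert np.allclose(invariant_coords(rel @ Q.T, w), invariant_coords(rel, w))

The sketch only illustrates why frame-projected coordinates do not change under global rotation; the model described in the paper works within a transformer architecture and adapts its frames dynamically, as summarized above.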

  

 Near-Optimal Policy Identification in Robust Constrained Markov Decision Processes via Epigraph Form

Toshinori Kitamura (OSX Intern/The University of Tokyo), Tadashi Kozuno (OSX), Wataru Kumagai (OSX), Kenta Hoshino (Kyoto University), Yohei Hosoe (Kyoto University), Kazumi Kasaura (OSX), Masashi Hamaya (OSX), Paavo Parmas (The University of Tokyo), Yutaka Matsuo (The University of Tokyo)

When applying reinforcement learning to practical applications, there are two critical requirements for algorithms. The first is robustness against discrepancies between training data and real-world deployment. The second is theoretically grounded safety. However, no existing algorithm satisfies both requirements simultaneously. In this study, we propose a method called epigraph form reinforcement learning, theoretically show that it outputs policies that are both robust to such discrepancies and safe, and confirm this experimentally.

https://openreview.net/forum?id=G5sPv4KSjR
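
For readers unfamiliar with the term, the epigraph form is a standard reformulation from constrained optimization; in generic (not paper-specific) notation, a constrained problem

    \min_{x} \; f(x) \quad \text{s.t.} \quad g(x) \le 0

is equivalent to minimizing an auxiliary threshold variable b over its epigraph:

    \min_{x,\, b} \; b \quad \text{s.t.} \quad f(x) \le b, \;\; g(x) \le 0.

The paper applies this kind of reformulation to policy optimization in robust constrained Markov decision processes; the exact formulation and guarantees are given in the linked paper, and the generic form above is shown only for orientation.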

 

We will also present the following research at the AI for Accelerated Materials Design (AI4Mat) Workshop, which will be held in conjunction with ICLR 2025.

 Transformer as a Neural Knowledge Graph

Yuki Nishihori (Osaka University), Yusei Ito (Osaka University), Yuta Suzuki (TOYOTA), Ryo Igarashi (OSX), Yoshitaka Ushiku (OSX), Kanta Ono (Osaka University)

In this study, we propose a Neural Knowledge Graph (NKG) to address the limited availability of linguistic data in CLaSP (Contrastive Language–Structure Pre-training), a method for contrastive learning between crystal structures and text that we presented at the recent NeurIPS AI4Mat workshop. NKG uses a Transformer to dynamically incorporate related knowledge in addition to the keywords taken from each paper. In experiments, NKG outperformed the conventional method on keyword-based crystal structure retrieval.
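
To give a rough idea of the contrastive pre-training that CLaSP builds on, the sketch below (written in PyTorch for this article; the function name, temperature value, and batch pairing are illustrative assumptions, not the CLaSP or NKG implementation) shows a symmetric contrastive loss between paired structure and text embeddings.

    # Illustrative sketch only (not the CLaSP/NKG implementation).
    # A symmetric contrastive loss between paired crystal-structure embeddings
    # and text embeddings.
    import torch
    import torch.nn.functional as F

    def contrastive_loss(struct_emb: torch.Tensor, text_emb: torch.Tensor,
                         temperature: float = 0.07) -> torch.Tensor:
        """struct_emb, text_emb: (B, D) tensors; row i of each forms a matched pair."""
        s = F.normalize(struct_emb, dim=-1)
        t = F.normalize(text_emb, dim=-1)
        logits = s @ t.T / temperature                 # (B, B) cosine-similarity logits
        labels = torch.arange(s.size(0), device=s.device)
        # Pull matched structure-text pairs together and push mismatched ones apart,
        # symmetrically for structure-to-text and text-to-structure retrieval.
        return 0.5 * (F.cross_entropy(logits, labels) + F.cross_entropy(logits.T, labels))

In the NKG setting described above, the text side would be enriched with Transformer-retrieved related knowledge rather than relying on paper keywords alone.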

 

※Author information is current as of the time of writing or submission and may have changed since then.


 

For any inquiries about OSX, please contact us here.
