In today's networked society, information security has become vitally important: its absence poses serious threats to the functioning of our civilization. Cryptography is the foundation of information security. Unfortunately, today's cryptography, especially one of its key components, public-key cryptography (PKC), is threatened by emerging quantum computers. So far, the range of problems that quantum computers can solve faster than classical computers is fairly narrow, and it may well stay that way. When it comes to PKC, however, quantum computers pose a momentous threat because they can efficiently solve the integer-factorization and discrete-logarithm problems. This means that sufficiently large quantum computers will break today's most popular PKCs, including RSA (Rivest-Shamir-Adleman), DSA (Digital Signature Algorithm), and ECC (elliptic-curve cryptography).

Post-quantum cryptography (PQC) is the study of PKCs that can resist attacks by such quantum computers. The most promising PQC candidates include code-based, hash-based, lattice-based, and multivariate PKCs. We have been working on the development and implementation of PQC on specialized hardware such as FPGAs. Currently, programming such hardware relies on general-purpose hardware-description languages such as Verilog or VHDL. These general-purpose tools are not well suited to developing cryptographic or cryptanalytic systems: not only are such systems usually highly complicated, but the security of a system is only as strong as its weakest link, so a mistake in the development of any component can put the security of the entire system at stake. For this reason, we have been developing system-level design tools better suited to cryptographic and information-security applications.
These tools include new programming languages and optimizing compilers that make cryptographic development easier and less error-prone.
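To give a flavor of lattice-based PQC, here is a minimal, illustrative sketch of single-bit encryption in the style of Regev's learning-with-errors (LWE) scheme. The parameters (modulus, dimensions, noise range) are chosen purely so the toy example decrypts deterministically; they are far too small to be secure, and this is not the construction our laboratory implements.

```python
import random

Q = 257          # modulus (prime)
N = 16           # secret dimension
M = 64           # number of public LWE samples
SUBSET = 24      # samples combined per encryption; total noise <= SUBSET < Q // 4

def keygen():
    s = [random.randrange(Q) for _ in range(N)]            # secret vector
    A = [[random.randrange(Q) for _ in range(N)] for _ in range(M)]
    e = [random.choice([-1, 0, 1]) for _ in range(M)]      # tiny noise
    b = [(sum(A[i][j] * s[j] for j in range(N)) + e[i]) % Q for i in range(M)]
    return s, (A, b)

def encrypt(pk, bit):
    A, b = pk
    rows = random.sample(range(M), SUBSET)                 # random subset sum
    u = [sum(A[i][j] for i in rows) % Q for j in range(N)]
    v = (sum(b[i] for i in rows) + bit * (Q // 2)) % Q
    return u, v

def decrypt(s, ct):
    u, v = ct
    d = (v - sum(u[j] * s[j] for j in range(N))) % Q       # = bit*(Q//2) + noise
    return 1 if Q // 4 < d < 3 * Q // 4 else 0

s, pk = keygen()
for bit in (0, 1):
    assert decrypt(s, encrypt(pk, bit)) == bit             # round-trip succeeds
```

Decryption works because the accumulated noise (at most SUBSET in absolute value) is smaller than Q/4, so the decoded value lands in the correct half of the modulus.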
A smart contract running on a blockchain such as Ethereum enables credible transactions without a third party trusted by the contract-entering parties, as it can self-execute and self-enforce without external administration or arbitration. Thus, some proponents argue that the integration of Web 2.0 with smart contracts will bring us Web 3.0, upon which business processes will enact themselves autonomously and securely. However, there is a major missing piece: the lack of privacy in smart contracts, which will prevent business processes involving trade secrets from migrating to Web 3.0. Take a typical smart-metering scenario as an example: ideally, we would like to record the sensor readings from a rental car onto a blockchain so that, later on, the insurance company can accurately determine the driver's degree of responsibility in a car accident. However, this would pose a serious threat to privacy, as a lot of information about the driver's trips can be deduced from these sensor readings. To meet this challenge, we have been working on a general system that allows on-chain smart contracts to handle private data stored off-chain, without compromising the privacy of that data, using cryptographic commitment schemes and zero-knowledge proof systems. Such a system would provide an ideal solution to the aforementioned example and others like it, enabling an autonomous, decentralized Web 3.0 without compromising user privacy.
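The commitment idea above can be sketched with a simple hash-based commitment: the car posts only a short digest on-chain, while the raw reading stays off-chain and can later be opened and verified. This is a minimal illustration, not our actual system; the example reading and field names are hypothetical, and a deployed scheme would pair algebraic commitments (e.g., Pedersen commitments) with zero-knowledge proofs so that statements about the hidden data can be proven without revealing it.

```python
import hashlib
import secrets

def commit(value: bytes):
    """Commit to `value`; returns (commitment, opening nonce)."""
    nonce = secrets.token_bytes(32)                  # randomness makes the commitment hiding
    c = hashlib.sha256(nonce + value).hexdigest()    # the digest makes it binding
    return c, nonce

def verify(commitment: str, value: bytes, nonce: bytes) -> bool:
    return hashlib.sha256(nonce + value).hexdigest() == commitment

# The car records only `c` on-chain; the reading and nonce stay off-chain.
reading = b"speed=87km/h;t=1700000000"               # hypothetical sensor record
c, nonce = commit(reading)
assert verify(c, reading, nonce)                     # honest opening succeeds
assert not verify(c, b"speed=40km/h;t=1700000000", nonce)  # tampered data is rejected
```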
With the rapid growth of the IoT (Internet of Things), in which not only smartphones but all kinds of devices connect everyday objects to the Internet, a large amount of data is being collected and utilized in our daily lives. How to put such "big data" and "open data" to positive use is an important and active topic.
Big data may include personal information, so anonymity must be assured when collecting, analyzing, and utilizing such data. The Amended Act on the Protection of Personal Information, passed by the Japanese Diet in September 2015, introduced the notion of "anonymously processed information" with the aim of promoting the active utilization of big data while ensuring the security and privacy of the data.
We have been doing research on anonymization methods, including k-anonymization and differential privacy, and on privacy-preserving data mining (PPDM). For example, we have studied the relations among anonymity evaluation indices and proposed a data-processing method that maintains both anonymity and data usefulness.
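As a concrete illustration of one of these indices, the following sketch computes the k of a table under k-anonymity: the size of the smallest group of records sharing the same quasi-identifier values. The sample records and column names are hypothetical, and real anonymization involves generalizing or suppressing values until the desired k is reached.

```python
from collections import Counter

def k_anonymity(records, quasi_identifiers):
    """Return the k of a table: the size of its smallest equivalence class
    with respect to the given quasi-identifier columns."""
    groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
    return min(groups.values())

# Hypothetical already-generalized records (ages bucketed, ZIP codes truncated).
records = [
    {"age": "30-39", "zip": "113-**", "disease": "flu"},
    {"age": "30-39", "zip": "113-**", "disease": "asthma"},
    {"age": "40-49", "zip": "150-**", "disease": "flu"},
    {"age": "40-49", "zip": "150-**", "disease": "diabetes"},
]
assert k_anonymity(records, ["age", "zip"]) == 2   # each equivalence class has 2 rows
```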
Moreover, we are researching privacy risks in large-scale data through the "Web-based Attack Response with Practical and Deployable Research Initiative (WarpDrive)" project.
Currently, passwords and ID cards are the major user-authentication methods. Unfortunately, these methods require considerable effort for memorizing passwords and managing ID cards. To reduce this burden, the following methods have been attracting much attention:
Various behaviors, such as handwriting and standing/sitting motions, can be used for behavioral authentication. Since authentication based on a single characteristic behavior may not achieve sufficient accuracy, it is expected that multiple characteristic behaviors will be combined to improve accuracy, which is known as composite authentication, or that additional authentication will be demanded upon detection of abnormal behavior, which is known as risk-based authentication. We have been working on behavioral authentication using wireless-LAN information, which is related to the user's location. A user carrying a device such as a smartphone receives radio waves from various wireless-LAN access points. We analyze patterns existing in the received data and apply them to user authentication.
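The idea can be sketched as matching an observed wireless-LAN signal-strength pattern against an enrolled profile. Everything here is hypothetical for illustration (the user names, RSSI values, and acceptance threshold are invented, and real systems use richer features and learned models rather than a fixed distance cutoff):

```python
import math

# Hypothetical enrollment data: per-user mean RSSI (dBm) from three access points.
profiles = {
    "alice": [-45.0, -70.0, -90.0],
    "bob":   [-80.0, -50.0, -60.0],
}

def authenticate(claimed_user, observed_rssi, threshold=15.0):
    """Accept if the observed RSSI vector is close enough (Euclidean
    distance) to the claimed user's enrolled profile."""
    return math.dist(profiles[claimed_user], observed_rssi) <= threshold

assert authenticate("alice", [-47.0, -68.0, -88.0])      # matches alice's pattern
assert not authenticate("alice", [-79.0, -52.0, -61.0])  # looks like bob's pattern
```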
With the widespread deployment of IoT devices, such devices are expected to face various cyber-attacks.
IoT devices run on quite different platforms depending on their application, such as sensing, and some have only limited computational power. We should achieve secure communication even for such devices, which requires lightweight, low-power encryption.
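As an example of the kind of cipher suited to such constrained devices, here is a compact sketch of Speck64/128, a published lightweight block cipher built from only rotations, modular addition, and XOR. This is an illustration of the design style, not a vetted implementation; real deployments would use an audited library and an authenticated encryption mode.

```python
MASK = 0xFFFFFFFF              # 32-bit words (Speck64/128: 64-bit block, 128-bit key)
ALPHA, BETA, ROUNDS = 8, 3, 27

def ror(x, r): return ((x >> r) | (x << (32 - r))) & MASK
def rol(x, r): return ((x << r) | (x >> (32 - r))) & MASK

def expand_key(k):
    """k: four 32-bit words -> 27 round keys (Speck key schedule)."""
    l, keys = list(k[1:]), [k[0]]
    for i in range(ROUNDS - 1):
        l.append(((keys[i] + ror(l[i], ALPHA)) & MASK) ^ i)
        keys.append(rol(keys[i], BETA) ^ l[-1])
    return keys

def encrypt(x, y, keys):
    for rk in keys:            # one ARX round per key: rotate, add, XOR
        x = ((ror(x, ALPHA) + y) & MASK) ^ rk
        y = rol(y, BETA) ^ x
    return x, y

def decrypt(x, y, keys):
    for rk in reversed(keys):  # invert each round in reverse order
        y = ror(y ^ x, BETA)
        x = rol(((x ^ rk) - y) & MASK, ALPHA)
    return x, y

keys = expand_key([0x03020100, 0x0B0A0908, 0x13121110, 0x1B1A1918])
pt = (0x3B726574, 0x7475432D)
ct = encrypt(*pt, keys)
assert decrypt(*ct, keys) == pt    # decryption inverts encryption
```

The appeal for IoT is that every operation maps to a few cheap instructions even on 8- or 16-bit microcontrollers, with no S-box tables occupying scarce memory.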
On the other hand, it is becoming popular to use cloud computing to control IoT devices, partly because the devices themselves have low performance and the information they collect must be analyzed in order to control them. Since clouds usually reside far from users, communication delays can arise. Edge computing avoids such delays by placing cloud functionality at the network edge, close to users. It also avoids unnecessary data transmission to the cloud by selecting which data should be processed at the edge and which in the cloud.
We are conducting research on the implementation of lightweight ciphers for IoT devices and on edge-computing systems.
With today's rapid development of computer technology, people are anticipating the advent of quantum computers. In 2016, IBM announced a five-qubit quantum computer, the IBM Quantum Experience, which enables us to perform quantum computation via the cloud.
By exploiting quantum superposition, one of the features of quantum computers, we can realize very-large-scale parallel computation. On the other hand, it has been pointed out that quantum computers can efficiently solve some number-theoretic problems, such as the integer factorization underlying RSA.
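Superposition and interference can be illustrated with a tiny classical simulation of a single qubit's statevector. This toy example only shows the principle; any quantum advantage requires many qubits, which classical simulation cannot scale to.

```python
import math

# A single-qubit state is a pair of amplitudes [a0, a1] with |a0|^2 + |a1|^2 = 1.
def hadamard(state):
    """Apply the Hadamard gate H to a one-qubit statevector."""
    a0, a1 = state
    s = 1 / math.sqrt(2)
    return [s * (a0 + a1), s * (a0 - a1)]

# Start in |0> and apply H: an equal superposition of |0> and |1>.
state = hadamard([1.0, 0.0])
p0, p1 = abs(state[0]) ** 2, abs(state[1]) ** 2
assert abs(p0 - 0.5) < 1e-12 and abs(p1 - 0.5) < 1e-12   # 50/50 measurement odds

# A second H undoes the first: interference returns the state to |0> exactly.
state = hadamard(state)
assert abs(state[0] - 1.0) < 1e-12 and abs(state[1]) < 1e-12
```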
Our laboratory is doing research on cryptographic systems that are resistant to quantum computation (post-quantum cryptography).