
suai.ru/our-contacts

quantum machine learning



Quantum-Based Modelling of Database States

Ingo Schmitt, Günther Wirsching, and Matthias Wolff

Abstract Database design for real-world scenarios requires complex data structures in order to model complex real-world objects adequately. Complex data structures can be constructed by recursive use of elementary data types and data type constructors. The mathematics behind quantum mechanics provides an interesting theory combining concepts from linear algebra, probability calculus, and logic. In order to make the mathematics of quantum mechanics available for database structures and states, we develop a mapping of concepts from the type theory of databases to the mathematics of quantum mechanics.

1 Introduction

The mathematics behind quantum mechanics [1] provides a formalism that very elegantly combines concepts from probability calculus, linear algebra, and logic. The semantics of a quantum system is expressed by a normalized ket vector in an inner product space. Here we show how to model the complex data structures of a database state as a normalized ket vector of an inner product space, see also [2]. Furthermore, we show how to read a database vector by using the statistics of quantum measurement. The database mapping to the mathematics of quantum mechanics proposed in the following is restricted to finite-dimensional and real inner product spaces. For a query language based on our mapping and on quantum logic as well as quantum measurement, we refer to [3].

Please note that we do not propose to perform a mapping onto a physical quantum computer. Instead, the proposed mapping is on a conceptual level rather than on an implementation level. The benefit of doing so is to bridge well-known database modelling concepts into another formalism. This is very promising because the rich theory of linear algebra and quantum logic [4] provides powerful concepts and gives us a deep understanding of certain database problems. For example, the relation between entanglement and functional dependencies between database values, and reasoning from databases based on quantum logic, are not well understood so far. On a conceptual level we are able to develop and to prove interesting new theorems.

I. Schmitt · M. Wolff
Brandenburg University of Technology Cottbus-Senftenberg, Cottbus, Germany
e-mail: schmitt@b-tu.de

G. Wirsching
Catholic University of Eichstätt-Ingolstadt, Eichstätt, Germany
e-mail: gunther.wirsching@ku.de

© Springer Nature Switzerland AG 2019
D. Aerts et al. (eds.), Quantum-Like Models for Information Retrieval and Decision-Making, STEAM-H: Science, Technology, Engineering, Agriculture, Mathematics & Health, https://doi.org/10.1007/978-3-030-25913-6_6

2 Motivating Example: Car Dealership

In the following we develop an example to demonstrate quantum-based modelling of a database structure and the measurement of its state. As an example we use the data objects managed by a car dealership. From the view of a database designer, cars are complex-structured objects. Every car is composed of different technical components, as shown in Fig. 1. Furthermore, a service booklet containing a record of car inspections exists for every car.

Some properties of the components of the car dealership are listed in Table 1, defining the state of a car. Furthermore, some atomic conditions for measurements based on these properties are given in Table 2.

Fig. 1 Components of the car dealership: the car management comprises the cars; every car is composed of a car body, an engine, a chassis, and a service booklet; the service booklet contains service entries.

Table 1 Properties of car components

Component      Property                  Value domain
Car            license tag               Set of valid license tags
Car            year of construction      2000–2020
Engine         number of cylinders       2–16
Engine         cylinder arrangement      Row, V-form, Boxer-form
Engine         fuel tank (l)             30–80
Car body       kilometre (km)            0–300.000
Car body       shipping volume (l)       200–500
Service entry  date                      01.01.2000 to 31.12.2030
Service entry  kilometre (km)            0–300.000


Table 2 Atomic conditions on car properties

Label  Condition
YC1    year of construction = 2016
YC2    year of construction = 2017
FT1    fuel tank 35
FT2    fuel tank is very large
K1     kilometre 15.000
K2     kilometre is very small
NC     number of cylinders = 4
CA1    cylinder arrangement = Row
CA2    cylinder arrangement = Boxer

When we look at condition FT2, we make the following observation: testing FT2 against the state of a car object cannot adequately return yes or no. Instead, we expect to receive a grade of compliance from the interval [0, 1]. A high value signals strong compliance and vice versa. Later on, we will show how the statistics of quantum measurement provides us a means to compute the required gradual values.
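Such a gradual value can be sketched numerically. The following Python snippet is a toy illustration, not the chapter's formal construction: the two-dimensional encoding and the weights of the car state are made up. It measures a normalized state vector against a projector and obtains ⟨ψ|P|ψ⟩ as a grade of compliance in [0, 1].

```python
import numpy as np

# Hypothetical 2-dimensional encoding of the fuel-tank property
# (illustrative only): axis 1 represents full compliance with FT2.
very_large = np.array([0.0, 1.0])      # condition "fuel tank is very large" as a ket
P = np.outer(very_large, very_large)   # projector |FT2><FT2|

# A car state encoded as a normalized superposition (made-up weights).
psi = np.array([0.6, 0.8])             # normalized: 0.36 + 0.64 = 1

grade = psi @ P @ psi                  # <psi|P|psi>, approximately 0.64
print(grade)
```

Because ψ is normalized and P is a projector, the value ⟨ψ|P|ψ⟩ always lies between 0 and 1, which is exactly the range required for a grade of compliance.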

First, we discuss how to model elementary data types by using the mathematics behind quantum mechanics. Here we focus on finite-dimensional and real inner product spaces. Later on, we will explain how to construct complex data types and how to map them into the quantum world.

3 Modelling Elementary Data Types

An elementary data type defines a data structure and operations to deal with its values. A data type is elementary if its values cannot be meaningfully decomposed into smaller semantic values. In our example, the property year of construction is elementary. Its domain covers all possible years of car construction. A useful operation could be the computation of the difference between two year values. We define the function Dom, which assigns to a data type a set of valid values. That set is often called the domain of the data type.

We distinguish between two kinds of elementary data types:

– Orthogonal data type: the values of that data type are independent of each other. There is no meaningful similarity between them; two values are either identical or not identical. In our example, the property cylinder arrangement is orthogonal.

– Non-orthogonal data type: besides the test for identity between two values, gradual similarity values may be required between them. In our example, the property fuel tank is non-orthogonal: a required volume of 35 L is more similar to a given value of 40 L than to one of 45 L.

The distinction between orthogonal and non-orthogonal often depends on the intended application semantics. In some applications it may be important to demand an exact value of 35 L for a fuel tank, so that every deviation is seen as wrong. In that case, fuel tank would be modelled as an orthogonal data type. For simplicity, in the following we assume that every property is categorized either as orthogonal or non-orthogonal.
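The contrast between the two kinds can be sketched with made-up unit vectors (the angles below are purely illustrative and not part of the chapter's construction): non-orthogonal value kets produce gradual overlaps, so that 40 L comes out as more similar to 35 L than 45 L does.

```python
import numpy as np

def ket(angle):
    """Unit vector in R^2 at the given angle (illustrative encoding)."""
    return np.array([np.cos(angle), np.sin(angle)])

# Made-up angles: nearby fuel-tank volumes get nearby directions.
v35, v40, v45 = ket(0.0), ket(0.3), ket(0.6)

sim_40 = (v35 @ v40) ** 2   # squared overlap as a gradual similarity
sim_45 = (v35 @ v45) ** 2
print(sim_40 > sim_45)      # 40 l is more similar to 35 l than 45 l is
```

For an orthogonal data type the same overlaps would be exactly 0 or 1, which recovers the pure identity test.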

In the next subsections we show how to map an elementary data type dt with a finite domain

Dom(dt) := {V1, . . . , Vk}

to a family of ket vectors of an inner product space. The mapping of a value to a ket vector is denoted by the symbol ↦. The function QDom assigns to a data type the set of ket vectors which appear as possible outcomes of this mapping.

3.1 Orthogonal Data Types

The values of an orthogonal data type dt are bijectively mapped to ket vectors forming an orthonormal basis of an inner product space:

QDom(dt) = {|V1⟩, . . . , |Vk⟩}

Dom(dt) → QDom(dt), ∀i ∈ {1, . . . , k} : Vi ↦ |Vi⟩.

The corresponding ket vectors are taken to be mutually orthogonal; they span a k-dimensional inner product space.
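As a minimal sketch of this mapping, the snippet below uses the cylinder arrangement values from Table 1 and the standard basis of R³ as the orthonormal basis (any other orthonormal basis would do equally well):

```python
import numpy as np

# Orthogonal data type: cylinder arrangement with k = 3 values.
values = ["Row", "V-form", "Boxer-form"]

# Bijectively map each value to a ket of an orthonormal basis of R^3.
kets = {v: np.eye(len(values))[i] for i, v in enumerate(values)}

print(kets["Row"] @ kets["V-form"])             # 0.0: distinct values are orthogonal
print(kets["Boxer-form"] @ kets["Boxer-form"])  # 1.0: every ket is normalized
```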

Let us take a basis ket vector |Vx⟩ for a value of an orthogonal property. If we want to test the value Vi for identity with Vx, we proceed in a way reflecting quantum measurement. We construct the projector P = |Vi⟩⟨Vi|:

⟨Vx|P|Vx⟩ = ⟨Vx|Vi⟩⟨Vi|Vx⟩ = 1 if i = x, 0 otherwise.
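This identity test can be checked numerically; the sketch below uses the standard basis of R³ to stand in for the kets |V1⟩, |V2⟩, |V3⟩:

```python
import numpy as np

k = 3
V = np.eye(k)   # row V[i] plays the role of the basis ket |V_{i+1}>

def identity_test(i, x):
    """Return <Vx|P|Vx> for the projector P = |Vi><Vi|."""
    P = np.outer(V[i], V[i])
    return V[x] @ P @ V[x]

print(identity_test(1, 1))   # 1.0 when i = x
print(identity_test(1, 2))   # 0.0 when i != x
```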

For testing whether a value x is contained in a value set S, we use the projector P = Σs∈S |Vs⟩⟨Vs|:

⟨Vx|P|Vx⟩ = ⟨Vx| ( Σs∈S |Vs⟩⟨Vs| ) |Vx⟩ = Σs∈S ⟨Vx|Vs⟩⟨Vs|Vx⟩ = 1 if x ∈ S, 0 otherwise.
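The set-membership test admits the same kind of numerical check (again with the standard basis of R³ standing in for the value kets):

```python
import numpy as np

k = 3
V = np.eye(k)   # basis kets |V1>, |V2>, |V3> as rows

def membership_test(S, x):
    """Return <Vx|P|Vx> for the projector P = sum over s in S of |Vs><Vs|."""
    P = sum(np.outer(V[s], V[s]) for s in S)
    return V[x] @ P @ V[x]

print(membership_test({0, 1}, 0))   # 1.0: x is in S
print(membership_test({0, 1}, 2))   # 0.0: x is not in S
```

Since the summed outer products of orthonormal kets form a projector onto the subspace spanned by S, the measurement again yields exactly 0 or 1 for basis states.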