
www.nature.com/scientificreports/

 

 

 


 

 

 


For the Markov model, the probability of reporting rating l at time t2 equals

$p(R(t_2) = l) = \big\| M_l \, T(t_2 - t_1) \, T(t_1) \, \phi(0) \big\|_1$.   (1)

 

 

 

 

 

 

and for the quantum model, the probability of reporting rating l at time t2 equals

 

$p(R(t_2) = l) = \big\| M_l \, U(t_2 - t_1) \, U(t_1) \, \psi(0) \big\|^2$.   (2)

 

 

 

 

 

 

For the Markov model, the joint probability of choosing category k at time t1 and then choosing category l at time t2 equals

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

$p(R(t_1) = k, R(t_2) = l) = \big\| M_l \, T(t_2 - t_1) \, M_k \, T(t_1) \, \phi(0) \big\|_1$.   (3)

 

 

 

 

For the quantum model, the joint probability of choosing category k at time t1 and then choosing category l at time t2 equals

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

 

$p(R(t_1) = k, R(t_2) = l) = \big\| M_l \, U(t_2 - t_1) \, M_k \, U(t_1) \, \psi(0) \big\|^2$.   (4)

 

 

 

 

As can be seen in Eqs. 3 and 4, both models include a “collapse” on the choice at time t1. But this turns out to have no effect for the Markov model. To test interference, we sum Eqs. 3 and 4 across k and compare these sums with Eqs. 1 and 2, respectively.
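Numerically, the interference test can be sketched as follows. This is a toy illustration under assumed parameters (the 5-state lattice, the rates α and β, and the Hamiltonian below are hypothetical choices, not the fitted models from the paper): summing the Markov joint probabilities of Eq. 3 over k reproduces the marginal of Eq. 1 exactly, while summing the quantum joint probabilities of Eq. 4 over k generally differs from Eq. 2.

```python
import numpy as np
from scipy.linalg import expm

n = 5                                  # hypothetical number of confidence states
alpha, beta = 1.2, 0.8                 # assumed up/down transition rates

# Markov intensity matrix K for a birth-death chain (columns sum to zero)
K = np.zeros((n, n))
for i in range(n):
    out = 0.0
    if i + 1 < n:
        K[i + 1, i] = alpha; out += alpha
    if i > 0:
        K[i - 1, i] = beta; out += beta
    K[i, i] = -out

# Assumed quantum Hamiltonian: drift on the diagonal, diffusion couplings off it
mu, sigma = 1.0, 0.7
H = mu * np.diag(np.arange(n)) + sigma * (np.eye(n, k=1) + np.eye(n, k=-1))

t1, t2 = 0.5, 1.2
T1, T21 = expm(K * t1), expm(K * (t2 - t1))              # T(t1), T(t2 - t1)
U1, U21 = expm(-1j * H * t1), expm(-1j * H * (t2 - t1))  # U(t1), U(t2 - t1)

phi0 = np.zeros(n); phi0[n // 2] = 1.0                   # initial Markov state
psi0 = phi0.astype(complex)                              # initial quantum state

def project(v, k):
    """Apply the measurement operator M_k (keep only component k)."""
    w = np.zeros_like(v); w[k] = v[k]; return w

# Eq. 1 versus Eq. 3 summed over k (Markov model)
m_marg = T21 @ (T1 @ phi0)
m_sum = sum(T21 @ project(T1 @ phi0, k) for k in range(n))

# Eq. 2 versus Eq. 4 summed over k (quantum model)
q_marg = np.abs(U21 @ (U1 @ psi0)) ** 2
q_sum = sum(np.abs(U21 @ project(U1 @ psi0, k)) ** 2 for k in range(n))

interference = np.abs(q_marg - q_sum).max()
```

Here m_marg and m_sum agree (the L1 norm is linear over the nonnegative Markov state, so the collapse at t1 has no effect), while interference is nonzero because squaring the norm introduces cross terms between the paths that the collapse removes.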

The Markov model required fitting two parameters: a “drift” rate parameter μ = (α − β) and a diffusion rate parameter γ = (α + β). The quantum model required fitting two parameters: a “drift” rate parameter μ, and a “diffusion” parameter σ. The parameter μ must be real, but σ can be complex. However, to reduce the number of parameters, we forced σ to be real. The model fitting procedure for both the Markov and the quantum models entailed estimating the two parameters from conditions 1 and 2 separately for each participant and each coherence level using maximum likelihood.

The Markov-V model used an approximately normal distribution of “drift” rate parameters. This model required estimating three parameters: μ, representing the mean of the distribution of drift rates; σ, representing the standard deviation of the drift rates; and υ, representing the diffusion rate. These were also estimated using maximum likelihood. The predictions for the Markov-V model were then obtained from the expectation

$p(R(t_1) = k, R(t_2) = l) = \sum_{\mu} p(\mu) \, p[R(t_1) = k, R(t_2) = l \mid \mu]$   (5)

where p(μ) is a discrete approximation to the normal distribution.
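The expectation in Eq. 5 can be sketched with a simple discretization of the normal distribution. The grid width, the values μ = 0.5 and σ = 0.3, and the conditional probability function below are all hypothetical placeholders; a real implementation would evaluate Eq. 3 at each drift value:

```python
import numpy as np
from scipy.stats import norm

mu_mean, mu_sd = 0.5, 0.3                  # hypothetical fitted mean and sd
grid = np.linspace(mu_mean - 3 * mu_sd, mu_mean + 3 * mu_sd, 21)
weights = norm.pdf(grid, mu_mean, mu_sd)
weights /= weights.sum()                   # p(mu): discrete normal approximation

def joint_prob_given_mu(mu):
    # placeholder for p[R(t1)=k, R(t2)=l | mu]; Eq. 3 would be evaluated here
    return 1.0 / (1.0 + np.exp(-mu))

# Eq. 5: average the conditional joint probability over the drift distribution
p_joint = float(np.sum(weights * np.array([joint_prob_given_mu(m) for m in grid])))
```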

A master equation can be formed by combining Schrödinger and Lindblad evolution operators. The master equation operates on a state defined by a density matrix, ρ(t), which is formed by the outer product ρ(t) = |ψ(t)⟩⟨ψ(t)|. A coherent quantum state has a density matrix containing off-diagonal terms; a classical state has a density matrix with only diagonal terms. The system guided by the master equation initially starts out in a coherent quantum state, but then decoheres toward a classical state.
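A minimal numerical sketch of this decoherence, for an assumed two-state system with a pure-dephasing Lindblad operator (the Hamiltonian, the rate γ, and the Euler step size are illustrative choices, not taken from the paper):

```python
import numpy as np

H = np.diag([1.0, -1.0]).astype(complex)        # assumed diagonal Hamiltonian
gamma = 0.5                                     # assumed dephasing rate
L = np.sqrt(gamma) * np.diag([1.0, -1.0]).astype(complex)  # Lindblad operator

psi0 = np.array([1.0, 1.0], dtype=complex) / np.sqrt(2)
rho = np.outer(psi0, psi0.conj())               # rho(0) = |psi><psi|: coherent

def master_rhs(rho):
    """d rho/dt = -i[H, rho] + L rho L^dag - (1/2){L^dag L, rho}."""
    comm = -1j * (H @ rho - rho @ H)
    LdL = L.conj().T @ L
    diss = L @ rho @ L.conj().T - 0.5 * (LdL @ rho + rho @ LdL)
    return comm + diss

off0 = abs(rho[0, 1])                           # initial off-diagonal coherence
dt, steps = 0.001, 5000                         # Euler integration to t = 5
for _ in range(steps):
    rho = rho + dt * master_rhs(rho)

off_final = abs(rho[0, 1])                      # decoheres toward classical
```

The trace stays at one (the master equation is trace preserving) while the off-diagonal term decays roughly as e^(−2γt): the state starts as a coherent quantum state and moves toward a classical, diagonal density matrix.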

Data availability

The datasets and computer programs used in the current study are available at http://mypage.iu.edu/jbusemey/quantum/DynModel/DynModel.htm for models and https://osf.io/462jf/ for data.

Received: 10 February 2019; Accepted: 14 November 2019;


Scientific Reports | (2019) 9:18025 | https://doi.org/10.1038/s41598-019-54383-9



Acknowledgements

This research was supported by AFOSR Grant FA9550-15-1-0343 to J.R.B., an NSF Grant SBE-0955410 to T.J.P., and an NSF Graduate Research Fellowship DGE-1424871 to P.D.K.

Author contributions

T.P. and P.K. conceived and conducted the experiment. J.B. analyzed results. All three helped write the manuscript.

Competing interests

The authors declare no competing interests.

Additional information

Supplementary information is available for this paper at https://doi.org/10.1038/s41598-019-54383-9. Correspondence and requests for materials should be addressed to J.R.B.

Reprints and permissions information is available at www.nature.com/reprints.

Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/.

© The Author(s) 2019


Accepted Manuscript

Data fusion using Hilbert space multi-dimensional models

Jerome Busemeyer, Zheng Wang

PII: S0304-3975(17)30891-5
DOI: https://doi.org/10.1016/j.tcs.2017.12.007
Reference: TCS 11410
To appear in: Theoretical Computer Science
Received date: 21 May 2017
Revised date: 24 November 2017
Accepted date: 4 December 2017

Please cite this article in press as: J. Busemeyer, Z. Wang, Data fusion using Hilbert space multi-dimensional models, Theoret. Comput. Sci. (2018), https://doi.org/10.1016/j.tcs.2017.12.007

This is a PDF file of an unedited manuscript that has been accepted for publication. As a service to our customers we are providing this early version of the manuscript. The manuscript will undergo copyediting, typesetting, and review of the resulting proof before it is published in its final form. Please note that during the production process errors may be discovered which could affect the content, and all legal disclaimers that apply to the journal pertain.


Data fusion using Hilbert Space Multi-Dimensional Models

Jerome Busemeyer

Indiana University

Zheng Wang

The Ohio State University

Abstract

General procedures for constructing, estimating, and testing Hilbert space multi-dimensional (HSM) models, built from quantum probability theory, are presented. HSM models can be applied to collections of K different contingency tables obtained from a set of p variables that are measured under different contexts. A context is defined by the measurement of a subset of the p variables that are used to form a table. HSM models provide a representation of the collection of K tables in a low-dimensional vector space, even when no single joint probability distribution across the p variables exists. HSM models produce parameter estimates that provide a simple and informative interpretation of the complex collection of tables.

Keywords: quantum probability, Hilbert space, multidimensional models, contingency table analysis, data fusion

1. Introduction

When large data sets are collected from different contexts, often they can be summarized by collections of contingency tables or cross-tabulation tables. Suppose there are p different variables (Y1, . . . , Yp) that can be used to measure objects, or events, or people. It may not be possible to measure all p variables at once; instead, only a subset of variables (Yk1 , . . . , Yks ), s ≤ p, can be measured at once. Each subset forms a context k of measurement [8] [16]. More than one context can be collected, which forms a collection of K data tables (T1, . . . , Tk, . . . , TK), each collected under a different context k.

Preprint submitted to Theoretical Computer Science

January 10, 2018


Each table Tk is a joint relative frequency, or contingency, table based on a subset of variables.

For example, a research problem could involve three variables (Y1, Y2, Y3), but some tables might include only a subset of the three variables. One context might involve the measurement of a single variable Y1 that has 5 values, forming a 1-way frequency table T1 composed of 5 frequencies. Another context could be used to form a 5 × 3 table T2, composed of joint frequencies for two variables (Y1, Y2). A third context could form a 3 × 2 table T3 containing variables (Y2, Y3), and a fourth could form a 5 × 2 table T4 containing variables (Y1, Y3).

A critical data fusion problem arises: how to integrate and synthesize these K different tables into a compressed, coherent, and interpretable representation? This question arises in relational database theory [1], where the problem is to find a universal relation capable of reproducing a set of component relations defined on all the data. In statistics, the problem is to find a single latent p-way joint distribution of the observed variables that can reproduce the frequencies in the K different tables by marginalizing across variables in the p-way table [5]. Often Bayesian causal networks are used to reduce the number of latent probability parameters by imposing conditional independence assumptions [13]. Unfortunately, however, in many cases, no universal relation exists and no p-way joint distribution can reproduce the observed tables (see, e.g., [7] [2])! This occurs when the data tables violate consistency constraints required by classical (Kolmogorov) probability theory. In this case, no Bayesian network representation composed of the p variables can even be formed. In the following sections, we give concrete examples of the various types of possible joint probability violations.
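These consistency constraints can be illustrated concretely. In the sketch below (all numbers hypothetical), three binary variables first have a genuine 3-way joint distribution, so every context table is a marginal of it and contexts sharing a variable agree on that variable's 1-way marginal; then three pairwise tables that are each perfectly anticorrelated are shown to admit no joint distribution at all, because no assignment of the three binary variables is consistent with all three tables:

```python
import numpy as np
from itertools import product

# Classical case: a single 3-way joint over binary (Y1, Y2, Y3) exists
rng = np.random.default_rng(0)
joint = rng.random((2, 2, 2))
joint /= joint.sum()

T12 = joint.sum(axis=2)          # context measuring (Y1, Y2)
T23 = joint.sum(axis=0)          # context measuring (Y2, Y3)

# Contexts sharing Y2 must agree on its 1-way marginal
y2_from_T12 = T12.sum(axis=0)
y2_from_T23 = T23.sum(axis=1)

# Non-classical case: perfect pairwise anticorrelation in all three 2-way
# tables. A joint distribution could only put mass on triples (y1, y2, y3)
# with y1 != y2, y2 != y3, and y1 != y3 -- and no binary triple does that.
feasible_atoms = [t for t in product((0, 1), repeat=3)
                  if t[0] != t[1] and t[1] != t[2] and t[0] != t[2]]
```

Since feasible_atoms comes out empty, no p-way joint distribution can reproduce those three anticorrelated tables; this is the kind of violation that motivates the HSM representation.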

The data fusion problem presented above is not new. Discussions concerning the conditions for the existence of a single joint distribution to reproduce a collection of K different tables have a long history that goes all the way back to George Boole [26]. Vorob’ev [29] was one of the first to begin a rigorous program to identify all of these conditions. For example, the famous Bell inequality describes a condition required for 4 two-way tables constructed according to the Bell experimental design to be described by 4 binary random variables [17]. The relevance of this data fusion problem for human judgments was first pointed out by Aerts and Aerts [4]. Recent work has identified very general conditions required for arbitrary collections of tables [12] [15].
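The Bell condition just mentioned can be sketched as a numeric check. The four 2 × 2 tables below are hypothetical (rows and columns index the ±1 outcomes of the two variables measured in that context); combining each table's correlation E in the CHSH form, a single joint distribution over the 4 binary variables requires |S| ≤ 2, and these tables violate that bound:

```python
import numpy as np

def correlation(table):
    # E = sum over outcomes of (+/-1)(+/-1) * p(outcome); rows/cols = (+1, -1)
    return table[0, 0] - table[0, 1] - table[1, 0] + table[1, 1]

# Hypothetical tables from the four Bell measurement contexts (each sums to 1)
agree = np.array([[0.425, 0.075], [0.075, 0.425]])     # correlated pair, E = 0.7
disagree = np.array([[0.075, 0.425], [0.425, 0.075]])  # anticorrelated, E = -0.7

# CHSH combination of the four contexts: E(A1,B1) + E(A1,B2) + E(A2,B1) - E(A2,B2)
S = correlation(agree) + correlation(agree) + correlation(agree) - correlation(disagree)
```

Here S = 2.8 > 2, so no joint distribution over the four binary variables reproduces all four tables, even though each table is individually a valid probability distribution; S still lies within the quantum bound 2√2 ≈ 2.83.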

Hilbert space multi-dimensional (hereafter, denoted HSM) modeling is
