
Commit 91d7cf4 ("update")

Parent commit: 16e4fad

7 files changed

Lines changed: 158 additions & 121 deletions

File tree

doc/pub/QCML/html/QCML-bs.html

Lines changed: 35 additions & 28 deletions
@@ -8,8 +8,8 @@
 <meta http-equiv="Content-Type" content="text/html; charset=utf-8" />
 <meta name="generator" content="DocOnce: https://github.com/doconce/doconce/" />
 <meta name="viewport" content="width=device-width, initial-scale=1.0" />
-<meta name="description" content="Quantum Machine Learning">
-<title>Quantum Machine Learning</title>
+<meta name="description" content="Quantum Machine Learning for Finance">
+<title>Quantum Machine Learning for Finance</title>
 <!-- Bootstrap style: bootstrap -->
 <!-- doconce format html QCML.do.txt --html_style=bootstrap --pygments_html_style=default --html_admon=bootstrap_panel --html_output=QCML-bs -->
 <link href="https://netdna.bootstrapcdn.com/bootstrap/3.1.1/css/bootstrap.min.css" rel="stylesheet">
@@ -78,7 +78,7 @@
 <span class="icon-bar"></span>
 <span class="icon-bar"></span>
 </button>
-<a class="navbar-brand" href="QCML-bs.html">Quantum Machine Learning</a>
+<a class="navbar-brand" href="QCML-bs.html">Quantum Machine Learning for Finance</a>
 </div>
 <div class="navbar-collapse collapse navbar-responsive-collapse">
 <ul class="nav navbar-nav navbar-right">
@@ -103,7 +103,7 @@
 <!-- ------------------- main content ---------------------- -->
 <div class="jumbotron">
 <center>
-<h1>Quantum Machine Learning</h1>
+<h1>Quantum Machine Learning for Finance</h1>
 </center> <!-- document title -->

 <!-- author(s): Master of Science thesis project -->
@@ -122,14 +122,13 @@ <h4>May 4, 2025</h4>
 <h2 id="quantum-computing-and-machine-learning" class="anchor">Quantum Computing and Machine Learning </h2>

 <p><b>Quantum Computing and Machine Learning</b> are two of the most promising
-approaches for studying complex physical systems where several length
-and energy scales are involved.
+approaches for studying complex systems with many degrees of freedom.
 </p>

 <p>Quantum computing is an emerging area of computer science that
 leverages the principles of quantum mechanics to perform computations
 beyond the capabilities of classical computers. Unlike classical
-computers, which use bits to represent data as 0s or 1s, quantum
+computers, which represent data as bits \( 0 \) or \( 1 \), quantum
 computers use quantum bits, or qubits. Qubits can exist in multiple
 states simultaneously (superposition) and can be entangled with one
 another, allowing quantum computers to process vast amounts of
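The superposition and entanglement concepts described in this hunk can be made concrete with a short state-vector sketch, here as an illustrative aside in plain NumPy (not part of the committed file):

```python
import numpy as np

# Single-qubit computational basis state |0>
ket0 = np.array([1.0, 0.0])

# Hadamard gate: puts |0> into the equal superposition (|0> + |1>)/sqrt(2)
H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2.0)
plus = H @ ket0

# CNOT entangles the two qubits: (H|0>) x |0>  ->  (|00> + |11>)/sqrt(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)
bell = CNOT @ np.kron(plus, ket0)

print(np.round(bell, 3))  # equal amplitudes on |00> and |11> only
```

The resulting Bell state has nonzero amplitude only on |00> and |11>, so a measurement of one qubit fixes the other: exactly the entanglement the paragraph refers to.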
@@ -156,7 +155,7 @@ <h2 id="quantum-computing-and-machine-learning" class="anchor">Quantum Computing
 effectively.
 </p>

-<p>Quantum computing and QML hold promise for various applications, including:</p>
+<p>Quantum computing and QML hold promise for many different types of applications, including:</p>

 <ol>
 <li> Drug Discovery: Simulating molecular structures to expedite the development of new medications.</li>
@@ -179,16 +178,16 @@ <h2 id="quantum-computing-and-machine-learning" class="anchor">Quantum Computing
 <li> Classical and quantum Boltzmann machines</li>
 </ol>
 <p>The data sets will span both regression and classification problems,
-with an emphasis on simulating time series of relevance for financial
-problems. The thesis will be done in close collaboration with Norges
+with an emphasis on simulating time series, in particular of relevance for financial
+problems. The thesis will be done in close collaboration with <b>Norges
 Bank Investment Management, Simula Research Laboratory and the
-University of Oslo.
+University of Oslo</b>.
 </p>
 <h2 id="support-vector-machines" class="anchor">Support vector machines </h2>

 <p>A central model in classical
 supervised learning is the support vector machine (SVM), which is a
-max-margin classifier. SVMs are widely used for binary classification
+maximal-margin classifier. SVMs are widely used for binary classification
 and have extensions to regression problems as well.
 They build on statistical learning
 theory and are known for finding decision boundaries with maximal
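The maximal-margin idea in this hunk can be sketched with scikit-learn on synthetic data (an illustrative toy example, not part of the committed file):

```python
import numpy as np
from sklearn.datasets import make_blobs
from sklearn.svm import SVC

# Two well-separated clusters: a toy binary classification problem
X, y = make_blobs(n_samples=100, centers=2, random_state=0, cluster_std=0.8)

# A linear SVM is a maximal-margin classifier: among all separating
# hyperplanes it picks the one farthest from the closest training points
clf = SVC(kernel="linear", C=1.0)
clf.fit(X, y)

# Decision boundary is w.x + b = 0; the margin width is 2/||w||
w = clf.coef_[0]
margin = 2.0 / np.linalg.norm(w)
print(f"support vectors: {len(clf.support_vectors_)}, margin width: {margin:.3f}")
```

Only the support vectors (the points on the margin) determine the boundary, which is what the statistical-learning-theory guarantees mentioned above rest on.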
@@ -215,7 +214,7 @@ <h2 id="quantum-neural-networks-and-variational-circuits" class="anchor">Quantum
 parameterized circuit is applied, and measurements yield outputs.
 For example, it has been shown recently that certain QNNs can exhibit higher
 effective dimension (and thus capacity to generalize) than comparable
-classical networks , suggesting a potential quantum advantage.
+classical networks, suggesting a potential quantum advantage.
 </p>
 <h2 id="boltzmann-machines" class="anchor">Boltzmann machines </h2>

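The encode, apply-parameterized-circuit, measure pattern described in this hunk can be sketched for a single qubit in plain NumPy; the parameter-shift rule used below is the standard way such circuits are differentiated on hardware (an illustrative aside, not part of the committed file):

```python
import numpy as np

def ry(theta):
    """Single-qubit rotation about the y-axis."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def expval_z(theta):
    """Apply the parameterized gate RY(theta) to |0>, measure <Z>."""
    state = ry(theta) @ np.array([1.0, 0.0])
    Z = np.diag([1.0, -1.0])
    return float(state @ Z @ state)   # analytically equals cos(theta)

# Parameter-shift rule: the exact gradient of <Z> with respect to theta
# is obtained from just two extra circuit evaluations
theta = 0.3
grad = 0.5 * (expval_z(theta + np.pi / 2) - expval_z(theta - np.pi / 2))
print(grad)  # matches -sin(theta)
```

A QNN replaces `ry` with a multi-qubit parameterized circuit, but training still loops over exactly this evaluate-and-shift pattern.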
@@ -227,12 +226,7 @@ <h2 id="boltzmann-machines" class="anchor">Boltzmann machines </h2>
 to binary spin-systems and grouped into those that determine the
 output, the visible nodes, and those that act as latent variables, the
 hidden nodes.
-</p>
-
-<p>Furthermore, the network structure is linked to an energy function
-which facilitates the definition of a probability distribution over
-the possible node configurations by using a concept from statistical
-mechanics, i.e., Gibbs states. The aim of BM training is to learn a
+The aim of BM training is to learn a
 set of weights such that the resulting model approximates a target
 probability distribution which is implicitly given by training data.
 This setting can be formulated as discriminative as well as generative
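The energy-based Gibbs distribution underlying BM training can be sketched by brute-force enumeration for a tiny restricted Boltzmann machine; the weights below are random placeholders, not a trained model (an illustrative aside, not part of the committed file):

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(0)

# A tiny restricted Boltzmann machine: 3 visible and 2 hidden binary nodes
n_v, n_h = 3, 2
W = rng.normal(scale=0.1, size=(n_v, n_h))  # visible-hidden couplings (placeholders)
a = np.zeros(n_v)                           # visible biases
b = np.zeros(n_h)                           # hidden biases

def energy(v, h):
    """RBM energy function E(v, h) = -a.v - b.h - v.W.h."""
    return -(a @ v + b @ h + v @ W @ h)

# Gibbs distribution over all 2^(n_v + n_h) joint configurations: p ~ exp(-E)
configs = [(np.array(v), np.array(h))
           for v in product([0, 1], repeat=n_v)
           for h in product([0, 1], repeat=n_h)]
weights = np.array([np.exp(-energy(v, h)) for v, h in configs])
probs = weights / weights.sum()             # divide by the partition function Z

print(probs.sum())  # normalized to 1
```

Training adjusts `W`, `a`, `b` so that the marginal over the visible nodes approximates the data distribution; for realistic sizes the partition function is intractable and sampling replaces the enumeration above.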
@@ -244,24 +238,36 @@ <h2 id="boltzmann-machines" class="anchor">Boltzmann machines </h2>
 <p>Quantum Boltzmann Machines (QBMs) are a natural adaptation of BMs to the
 quantum computing framework. Instead of an energy function with nodes
 being represented by binary spin values, QBMs define the underlying
-network using a Hermitian operator, normally a parameterized Hamiltonian, see reference [1] below.
+network using a Hermitian operator, normally a parameterized Hamiltonian.
 </p>
 <h3 id="specific-tasks-and-milestones" class="anchor">Specific tasks and milestones </h3>

-<p>The aim of this thesis is to study the implementation and development of codes for
-several quantum machine learning methods, including quantum support vector machines, quantum neural networks and possibly Boltzmann machines, and possibly other classical machine learning algorithms, on a quantum computer. The thesis consists of three basic steps:
+<p>The aim of this thesis is to study the implementation and development
+of codes for several quantum machine learning methods, including
+quantum support vector machines, quantum neural networks and possibly
+Boltzmann machines, if time allows. The results will be compared with
+those from their classical counterparts. The final aim is to study
+data from finance with both classical and quantum machine learning
+algorithms in order to assess and test quantum machine learning
+algorithms and their potential for the analysis of data from finance.
+In setting up the algorithms, existing software libraries like
+Scikit-Learn, PennyLane, Qiskit and others will be used. This will allow for an efficient development and study of both classical and quantum machine learning algorithms.
 </p>

+<p>The thesis consists of three basic steps:</p>
+
 <ol>
-<li> Develop a classical machine code for studies of classification and regression problems.</li>
-<li> Compare the results from the classical Boltzmann machine with other deep learning methods.</li>
-<li> Develop an implementation of a quantum Boltzmann machine code to be run on existing quantum computers and classical computers. Compare the performance of the quantum Boltzmann machines with existing classical deep learning methods.</li>
+<li> Develop a classical machine learning framework for studies of supervised classification and regression problems, with an emphasis on data from finance. The main emphasis rests on deep learning methods (neural networks, Boltzmann machines and recurrent neural networks) and support vector machines.</li>
+<li> Compare and evaluate the results from the classical machine learning methods and assess their relevance for financial data.</li>
+<li> Develop and implement codes for quantum machine learning algorithms (quantum support vector machines, quantum neural networks and possibly quantum Boltzmann machines) to be run on existing quantum computers and classical computers. Compare the performance of the quantum machine learning with the above-mentioned classical methods with an emphasis on financial data.</li>
 </ol>
 <p>The milestones are:</p>
 <ol>
-<li> Spring 2025: Develop a code for classical Boltzmann machines to be applied to both classification and regression problems. In particular, the latter type of problem can be tailored to solving classical spin problems like the Ising model or quantum mechanical problems.</li>
-<li> Fall 2025: Develop a code for variational Quantum Boltzmann machines following reference [2] here. Make comparisons with classical deep learning algorithms on selected classification and regression problems.</li>
-<li> Spring 2026: The final part is to use the variational Quantum Boltzmann machines to study quantum mechanical systems. Finalize thesis.</li>
+<li> Spring 2025: Study basic quantum machine learning algorithms (quantum support vector machines, quantum neural networks) for simpler supervised problems from finance and/or other fields.</li>
+<li> Spring 2025: Compare the results of the simpler data sets with classical machine learning methods.</li>
+<li> Fall 2025: Set up the final data from finance to be analyzed with classical and quantum machine learning algorithms.</li>
+<li> Fall 2025: Develop a software framework which includes quantum support vector machines and quantum neural networks.</li>
+<li> Spring 2026: The final part is to include quantum Boltzmann machines, if time allows, and analyze the results from the different methods. Finalize thesis.</li>
 </ol>
 <p>The thesis is expected to be handed in May/June 2026.</p>
 <h3 id="literature" class="anchor">Literature </h3>
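The combination of libraries named in the hunk above can be sketched as a quantum-kernel SVM. The single-qubit angle-encoding feature map below, simulated classically with NumPy, is a hypothetical minimal stand-in for the quantum feature maps the thesis would study; the resulting Gram matrix is fed into scikit-learn's SVC (an illustrative aside, not part of the committed file):

```python
import numpy as np
from sklearn.svm import SVC

def feature_state(x):
    """Angle-encode a scalar feature as the single-qubit state RY(x)|0>."""
    return np.array([np.cos(x / 2), np.sin(x / 2)])

def quantum_kernel(X1, X2):
    """Kernel entries |<phi(x)|phi(x')>|^2, simulated classically."""
    S1 = np.array([feature_state(x) for x in X1])
    S2 = np.array([feature_state(x) for x in X2])
    return np.abs(S1 @ S2.T) ** 2

# Hypothetical toy 1-D data: two classes at small and large encoding angles
X = np.array([0.1, 0.2, 0.3, 2.8, 2.9, 3.0])
y = np.array([0, 0, 0, 1, 1, 1])

# A classical SVM consumes the precomputed quantum kernel unchanged
clf = SVC(kernel="precomputed")
clf.fit(quantum_kernel(X, X), y)
print(clf.predict(quantum_kernel(np.array([0.15, 2.85]), X)))
```

On real hardware (e.g. via Qiskit or PennyLane) only `quantum_kernel` changes: the overlaps are estimated by circuit sampling while the SVC training stays classical, which is why the precomputed-kernel interface fits this workflow well.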
@@ -270,6 +276,7 @@ <h3 id="literature" class="anchor">Literature </h3>
 <li> Amin et al., <b>Quantum Boltzmann Machines</b>, Physical Review X <b>8</b>, 021050 (2018).</li>
 <li> Maria Schuld and Francesco Petruccione, <b>Supervised Learning with Quantum Computers</b>, Springer, 2018.</li>
 <li> Claudio Conti, <b>Quantum Machine Learning</b>, Springer, 2024, sections 1.5-1.12 and chapter 2, see <a href="https://link.springer.com/book/10.1007/978-3-031-44226-1" target="_self"><tt>https://link.springer.com/book/10.1007/978-3-031-44226-1</tt></a>.</li>
+<li> Morten Hjorth-Jensen, <b>Quantum Computing and Quantum Machine Learning</b>, lecture notes with extensive codes at <a href="https://github.com/CompPhysics/QuantumComputingMachineLearning" target="_self"><tt>https://github.com/CompPhysics/QuantumComputingMachineLearning</tt></a>, in particular the last five sets of lectures.</li>
 </ol>
 <!-- ------------------- end of main content --------------- -->
 </div> <!-- end container -->
