Commit 5624232

oskar stuff

1 parent 91d7cf4 commit 5624232

2 files changed: 246 additions & 0 deletions

140 KB
Binary file not shown.
@@ -0,0 +1,246 @@
\documentclass[%
oneside,                 % oneside: electronic viewing, twoside: printing
final,                   % draft: marks overfull hboxes, figures with paths
10pt]{article}

\listfiles               % print all files needed to compile this document

\usepackage{relsize,makeidx,color,setspace,amsmath,amsfonts,amssymb}
\usepackage[table]{xcolor}
\usepackage{bm,ltablex,microtype}

\usepackage[pdftex]{graphicx}

\usepackage[T1]{fontenc}
%\usepackage[latin1]{inputenc}
\usepackage{ucs}
\usepackage[utf8x]{inputenc}

\usepackage{lmodern}     % Latin Modern fonts derived from Computer Modern

% Hyperlinks in PDF:
\definecolor{linkcolor}{rgb}{0,0,0.4}
\usepackage{hyperref}
\hypersetup{
    breaklinks=true,
    colorlinks=true,
    linkcolor=linkcolor,
    urlcolor=linkcolor,
    citecolor=black,
    filecolor=black,
    %filecolor=blue,
    pdfmenubar=true,
    pdftoolbar=true,
    bookmarksdepth=3     % Uncomment (and tweak) for PDF bookmarks with more levels than the TOC
    }
\setcounter{tocdepth}{2} % levels in table of contents

\clubpenalty = 10000
\widowpenalty = 10000
\raggedbottom
\makeindex
\usepackage[totoc]{idxlayout}   % for index in the toc
\usepackage[nottoc]{tocbibind}  % for references/bibliography in the toc
\begin{document}

\thispagestyle{empty}

\begin{center}
{\LARGE\bf
\begin{spacing}{1.25}
Quantum Machine Learning for Finance
\end{spacing}
}
\end{center}

\begin{center}
{\bf Master of Science thesis project} \\[0mm]
\end{center}

\begin{center}
% List of all institutions:
\end{center}

\begin{center}
May 4, 2025
\end{center}

\vspace{1cm}

\subsection*{Quantum Computing and Machine Learning}

\textbf{Quantum Computing and Machine Learning} are two of the most promising
approaches for studying complex systems with many degrees of freedom.

Quantum computing is an emerging area of computer science that
leverages the principles of quantum mechanics to perform computations
beyond the capabilities of classical computers. Unlike classical
computers, which represent data with bits that take the values $0$ or $1$,
quantum computers use quantum bits, or qubits. Qubits can exist in
multiple states simultaneously (superposition) and can be entangled
with one another, allowing quantum computers to process vast amounts
of information in parallel.

These unique properties enable quantum computers to tackle problems
that are currently intractable for classical systems, such as complex
simulations in chemistry and physics, optimization problems, and
large-scale data analysis.

Quantum machine learning (QML) is an interdisciplinary field that
combines quantum computing with machine learning techniques. The goal
is to enhance the performance of machine learning algorithms by
utilizing quantum computing’s capabilities.

In QML, quantum algorithms are developed to process and analyze data
more efficiently than classical algorithms. This includes tasks like
classification, regression, clustering, and dimensionality
reduction. By exploiting quantum phenomena, QML has the potential to
accelerate machine learning processes and handle larger datasets more
effectively.

Quantum computing and QML hold promise for many different types of applications, including:

\begin{enumerate}
\item Drug Discovery: Simulating molecular structures to expedite the development of new medications.

\item Financial Modeling: Optimizing portfolios and detecting fraudulent activities through complex data analysis.

\item Artificial Intelligence: Enhancing machine learning algorithms for faster and more accurate predictions.
\end{enumerate}

As quantum hardware continues to advance, the integration of quantum
computing into practical applications is becoming increasingly
feasible, opening up a new era of computational possibilities.

This thesis project deals with the study and implementation of quantum
machine learning methods applied to classical data
for supervised learning. The methods we will focus on are

\begin{enumerate}
\item Support vector machines and quantum support vector machines,

\item Neural networks and quantum neural networks, and possibly (if time allows)

\item Classical and quantum Boltzmann machines.
\end{enumerate}

The data sets will span both regression and classification problems,
with an emphasis on simulating time series, in particular of relevance
for financial problems. The thesis will be done in close collaboration
with \textbf{Norges Bank Investment Management, Simula Research
Laboratory and the University of Oslo}.

\subsection*{Support vector machines}

A central model in classical supervised learning is the support vector
machine (SVM), a maximal-margin classifier. SVMs are widely
used for binary classification and have extensions to regression
problems as well. They build on statistical learning theory and find
decision boundaries with maximal margin between the classes. In
particular, SVMs can perform non-linear classification by employing
the kernel trick, which implicitly maps data into a high-dimensional
feature space via a kernel function.

A Quantum Support Vector Machine (QSVM) replaces the classical kernel
or feature map with a quantum procedure. In a QSVM, classical data
points $\bm{x}$ are encoded into quantum states $|\phi(\bm{x})\rangle$
via a quantum feature map (a parameterized quantum circuit). The
inner product (overlap) between two such states serves as a quantum
kernel, measuring data similarity in a high-dimensional Hilbert space.

\subsection*{Quantum Neural Networks and Variational Circuits}

A Variational Quantum Algorithm (VQA) is built around a Variational
Quantum Circuit (VQC), that is, a quantum circuit with tunable
parameters which is trained using a classical optimizer. In practice, a
VQC (also called a Parameterized Quantum Circuit (PQC)) is used as a
Quantum Neural Network (QNN): data are encoded into quantum states, a
parameterized circuit is applied, and measurements yield outputs. For
example, it has been shown recently that certain QNNs can exhibit a
higher effective dimension (and thus capacity to generalize) than
comparable classical networks, suggesting a potential quantum
advantage.

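To make these ingredients concrete, the following minimal sketch
(again an illustration with assumed hyperparameters, ansatz and
variable names) uses PennyLane to build a small QNN: angle encoding of
the data, a trainable entangling ansatz, a Pauli-$Z$ expectation value
as output, and a classical gradient-descent optimizer updating the
circuit parameters.

\begin{verbatim}
# Sketch: a variational quantum circuit trained as a quantum neural
# network (assumed two-qubit device, ansatz and loss function).
import pennylane as qml
from pennylane import numpy as np

n_qubits = 2
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev)
def qnn(x, weights):
    # Data encoding followed by a trainable entangling ansatz.
    qml.AngleEmbedding(x, wires=range(n_qubits))
    qml.StronglyEntanglingLayers(weights, wires=range(n_qubits))
    return qml.expval(qml.PauliZ(0))   # scalar output in [-1, 1]

def loss(weights, X, y):
    # Mean squared error for labels y in {-1, +1}.
    return sum((qnn(x, weights) - t) ** 2 for x, t in zip(X, y)) / len(X)

# Classical optimizer acting on the quantum circuit's parameters:
shape = qml.StronglyEntanglingLayers.shape(n_layers=2, n_wires=n_qubits)
weights = np.random.uniform(0, 2 * np.pi, size=shape, requires_grad=True)
opt = qml.GradientDescentOptimizer(stepsize=0.1)
# for _ in range(100):
#     weights = opt.step(lambda w: loss(w, X_train, y_train), weights)
\end{verbatim}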
\subsection*{Boltzmann machines}

Boltzmann Machines (BMs) offer a powerful framework for modeling
probability distributions. These types of neural networks use an
undirected graph structure to encode relevant information. More
precisely, the respective information is stored in bias coefficients
and connection weights of network nodes, which are typically related
to binary spin systems and grouped into those that determine the
output, the visible nodes, and those that act as latent variables, the
hidden nodes. The aim of BM training is to learn a set of weights
such that the resulting model approximates a target probability
distribution which is implicitly given by training data. This setting
can be formulated as a discriminative as well as a generative learning
task. Applications have been studied in a large variety of domains
such as the analysis of quantum many-body systems, statistics,
biochemistry, social networks, signal processing and finance.

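For concreteness (a standard textbook formulation, not taken verbatim
from the project text): with binary units $s_i$, for example
$s_i=\pm 1$, grouped into visible units $\bm{v}$ and hidden units
$\bm{h}$, a Boltzmann machine assigns the energy
\begin{equation*}
E(\bm{v},\bm{h}) = -\sum_i b_i s_i - \sum_{i<j} w_{ij}\, s_i s_j ,
\end{equation*}
where the $b_i$ are the bias coefficients and the $w_{ij}$ the
connection weights mentioned above. The model distribution over the
visible units is the marginal Gibbs distribution
$p(\bm{v}) = \tfrac{1}{Z}\sum_{\bm{h}} e^{-E(\bm{v},\bm{h})}$, with $Z$
the partition function, and training adjusts $b_i$ and $w_{ij}$ so that
$p(\bm{v})$ approximates the distribution of the training data.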
Quantum Boltzmann Machines (QBMs) are a natural adaptation of BMs to the
quantum computing framework. Instead of an energy function with nodes
represented by binary spin values, QBMs define the underlying
network using a Hermitian operator, normally a parameterized
Hamiltonian.

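In the transverse-field form studied by Amin \emph{et al.} (listed in
the literature below), this Hamiltonian can for instance be written as
\begin{equation*}
H = -\sum_a \Gamma_a\, \sigma_a^x - \sum_a b_a\, \sigma_a^z
    - \sum_{a,b} w_{ab}\, \sigma_a^z \sigma_b^z ,
\end{equation*}
with the model distribution given by the Gibbs state
$\rho = e^{-H}/\mathrm{Tr}\!\left[e^{-H}\right]$; when the transverse
fields $\Gamma_a$ vanish, the classical Boltzmann machine is recovered.
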
\paragraph{Specific tasks and milestones.}

The aim of this thesis is to study the implementation and development
of codes for several quantum machine learning methods, including
quantum support vector machines, quantum neural networks and possibly
Boltzmann machines, if time allows. The results will be compared with
those from their classical counterparts. The final aim is to analyze
data from finance with both classical and quantum machine learning
algorithms in order to assess the potential of quantum machine
learning for this type of data.
In setting up the algorithms, existing software libraries like
Scikit-Learn, PennyLane, Qiskit and others will be used. This will
allow for efficient development and study of both classical and
quantum machine learning algorithms.

The thesis consists of three basic steps:

\begin{enumerate}
\item Develop a classical machine learning framework for studies of supervised classification and regression problems, with an emphasis on data from finance. The main emphasis rests on deep learning methods (neural networks, Boltzmann machines and recurrent neural networks) and support vector machines.

\item Compare and evaluate the results from the classical machine learning methods and assess their relevance for financial data.

\item Develop and implement codes for quantum machine learning algorithms (quantum support vector machines, quantum neural networks and possibly quantum Boltzmann machines) to be run on existing quantum computers and classical computers. Compare the performance of the quantum machine learning algorithms with the above-mentioned classical methods, with an emphasis on financial data.
\end{enumerate}

\noindent
The milestones are:
\begin{enumerate}
\item Spring 2025: Study basic quantum machine learning algorithms (quantum support vector machines, quantum neural networks) for simpler supervised problems from finance and/or other fields.

\item Spring 2025: Compare the results for the simpler data sets with classical machine learning methods.

\item Fall 2025: Set up the final data from finance to be analyzed with classical and quantum machine learning algorithms.

\item Fall 2025: Develop a software framework which includes quantum support vector machines and quantum neural networks.

\item Spring 2026: Include quantum Boltzmann machines, if time allows, and analyze the results from the different methods. Finalize the thesis.
\end{enumerate}

\noindent
The thesis is expected to be handed in May/June 2026.

\paragraph{Literature.}
\begin{enumerate}

\item Maria Schuld and Francesco Petruccione, \textbf{Supervised Learning with Quantum Computers}, Springer, 2018.

\item Claudio Conti, \textbf{Quantum Machine Learning}, Springer, see \href{https://link.springer.com/book/10.1007/978-3-031-44226-1}{\nolinkurl{https://link.springer.com/book/10.1007/978-3-031-44226-1}}.

\item M. Zhao et al., \textbf{A tutorial on quantum machine learning and quantum neural networks}, arXiv:2504.16131 (2025).

\item Amin et al., \textbf{Quantum Boltzmann Machines}, Physical Review X \textbf{8}, 021050 (2018).

\item Morten Hjorth-Jensen, \textbf{Quantum Computing and Quantum Machine Learning}, lecture notes with extensive codes at \href{https://github.com/CompPhysics/QuantumComputingMachineLearning}{\nolinkurl{https://github.com/CompPhysics/QuantumComputingMachineLearning}}, in particular the last five sets of lectures.
\end{enumerate}

\end{document}
