Compare commits


5 commits

Author  SHA1  Message  Date
Paul ALNET  6ceddb79f2  tex: add more NFBP graphs  2023-06-04 23:26:48 +02:00
Paul ALNET  2dc8ab1b36  tex: add NFBP graphs  2023-06-04 23:09:26 +02:00
Paul ALNET  1b48dacbed  tex: set defense date  2023-06-04 22:25:40 +02:00
Paul ALNET  bbe25c3d4c  tex: NFDBP add expected value T_i  2023-06-04 22:16:30 +02:00
Paul ALNET  ff1b121be9  tex: finalize NFDBP demo  2023-06-04 21:20:38 +02:00
7 changed files with 110 additions and 57 deletions

View file

@@ -368,6 +368,6 @@ def basic_demo():
)
stats_NFBP_iter(10**3, 10)
stats_NFBP_iter(10**5, 50)
print("\n\n")
stats_NFDBP(10**3, 10, 1)
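
# For context: a hypothetical, self-contained sketch of what such a Monte-Carlo
# routine can look like (the repository's stats_NFBP_iter and stats_NFDBP are not
# shown in this diff and may differ). It packs N uniform item sizes with Next Fit
# and records T_1, the number of items placed in the first bin.
import random

def next_fit_first_bin_items(N):
    """Pack N uniform(0,1) items with Next Fit; return how many land in bin 1."""
    count, level = 0, 0.0
    for _ in range(N):
        size = random.random()
        if level + size > 1:   # the item does not fit: bin 1 is closed
            return count
        level += size
        count += 1
    return count

def sample_T1(R, N):
    """R independent observations of T_1."""
    return [next_fit_first_bin_items(N) for _ in range(R)]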

View file

@@ -184,7 +184,7 @@ results in the form of histograms.
\footnotetext{The code is available in Annex \ref{annex:probabilistic}}
We will try to approximate $ \mathbb{E}[T] $ and $ \mathbb{E}[V] $ with $
\overline{X_N} $ using $ {S_n}^2 $. This operation will be done for both $ R =
2 $ and $ R = 10^5 $ simulations.
@@ -206,63 +206,81 @@ variance and further determine the Confidence Interval (95 \% certainty).
\paragraph{2 simulations} We first ran $ R = 2 $ simulations to observe the
behavior of the algorithm and the low precision of the results.
% TODO graph T_i 2 sim
\begin{figure}[h]
\centering
\includegraphics[width=0.8\textwidth]{graphics/graphic-NFBP-Ti-2-sim}
\caption{Histogram of $ T_i $ for $ R = 2 $ simulations and $ N = 50 $ items (number of items per bin)}
\label{fig:graphic-NFBP-Ti-2-sim}
\end{figure}
On this graph (figure \ref{fig:graphic-NFBP-Ti-2-sim}), we can see each value
of $ T_i $. Our calculations have yielded that $ \overline{T_1} = 1.0 $ and $
{S_N}^2 = 2.7 $. Our Student coefficient is $ t_{0.95, 2} = 4.303 $.
We can now calculate the Confidence Interval for $ T_1 $ for $ R = 2 $ simulations:
\begin{align*}
\overline{T_1} = \frac{1}{2} \sum_{k=1}^{2} {T_1}_k & = 1.0 \\
IC_{95\%}(T_1) & = \left[ 1.0 \pm 1.96 \frac{\sqrt{2.7}}{\sqrt{2}} \cdot 4.303 \right] \\
& = \left[ 1 \pm 9.8 \right] \\
\end{align*}
With two simulations, we obtain $ \overline{T_1} = 1.0 $.
We can see that the Confidence Interval is very large, which is due to the low
number of simulations. Looking at figure \ref{fig:graphic-NFBP-Ti-2-sim}, we
easily notice the high variance.
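These statistics are straightforward to compute programmatically. The helper below is a
minimal, hypothetical Python sketch (it is not the annex code and it assumes SciPy is
available for the Student quantile): it takes a list of simulated values of $ T_1 $ and
returns the mean, the unbiased sample variance and the corresponding 95 \% interval.
\begin{verbatim}
# Hypothetical helper, separate from the annex implementation.
from math import sqrt
from scipy.stats import t

def confidence_interval_95(observations):
    R = len(observations)
    mean = sum(observations) / R
    s2 = sum((x - mean) ** 2 for x in observations) / (R - 1)
    coeff = t.ppf(0.975, R - 1)   # Student coefficient (d.o.f. convention may differ)
    half_width = coeff * sqrt(s2 / R)
    return mean, s2, (mean - half_width, mean + half_width)
\end{verbatim}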
\begin{figure}[h]
\centering
\includegraphics[width=0.8\textwidth]{graphics/graphic-NFBP-Vi-2-sim}
\caption{Histogram of $ V_i $ for $ R = 2 $ simulations and $ N = 50 $ items (size of the first item in a bin)}
\label{fig:graphic-NFBP-Vi-2-sim}
\end{figure}
On the graph of $ V_i $ (figure \ref{fig:graphic-NFBP-Vi-2-sim}), we can see
that the sizes are scattered pseudo-randomly between $ 0 $ and $ 1 $, which is
unsurprising given the low number of simulations. The process determining the
statistics is the same as for $ T_i $, yielding $ \overline{V_1} = 0.897 $, $ {S_N}^2 =
0.2 $ and $ IC_{95\%}(V_1) = \left[ 0.897 \pm 1.3 \right] $. In this particular run,
the two values for $ V_1 $ are high (being bounded between $ 0 $ and $ 1 $).
\paragraph{100,000 simulations} In order to ensure better precision, we then
ran $ R = 10^5 $ simulations with $ N = 50 $ different items each.
\begin{figure}[h]
\centering
\includegraphics[width=0.8\textwidth]{graphics/graphic-NFBP-Ti-105-sim}
\caption{Histogram of $ T_i $ for $ R = 10^5 $ simulations and $ N = 50 $ items (number of items per bin)}
\label{fig:graphic-NFBP-Ti-105-sim}
\end{figure}
On this graph (figure \ref{fig:graphic-NFBP-Ti-105-sim}), we can see the
distribution of $ T_i $. Our calculations have yielded that $ \overline{T_1} = 1.72 $ and $
{S_N}^2 = 0.88 $. Our Student coefficient is $ t_{0.95, 10^5} \approx 2 $.
We can now calculate the Confidence Interval for $ T_1 $ for $ R = 10^5 $
simulations (the same process applies to $ V_1 $):
\begin{align*}
IC_{95\%}(T_1) & = \left[ 1.72 \pm 1.96 \frac{\sqrt{0.88}}{\sqrt{10^5}} \cdot 2 \right] \\
& = \left[ 1.72 \pm 0.012 \right] \\
\end{align*}
We can see that the Confidence Interval is very small, thanks to the large number of iterations.
This results in a stable, smooth histogram in figure \ref{fig:graphic-NFBP-Ti-105-sim}.
\paragraph{Distribution of $ T_i $} We first studied how many items were
present per bin.
% TODO sim of T_i
We determined the empirical mean to be
\[
\overline{T_i} = \frac{1}{20} \sum_{k=1}^{20} T_k = 1.5 \qquad \forall 1 \leq i \leq 20
\]
\paragraph{Distribution of $ V_i $} We then looked at the size of the first
item in each bin.
\begin{figure}[h]
\centering
\includegraphics[width=0.8\textwidth]{graphics/graphic-NFBP-Vi-105-sim}
\caption{Histogram of $ V_i $ for $ R = 10^5 $ simulations and $ N = 50 $ items (size of the first item in a bin)}
\label{fig:graphic-NFBP-Vi-105-sim}
\end{figure}
\paragraph{Asymptotic behavior of $ H_n $} Finally, we analyzed how many bins
were needed to store $ n $ items. We used the numbers from the $ R = 10^5 $ simulations.
% TODO histograms
% TODO analysis histograms
\cite{hofri:1987}
% TODO add some historical background
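To make this behavior concrete, the ratio of bins to items can be estimated empirically
for growing $ n $. The snippet below is a small, hypothetical sketch for NFBP with
uniform item sizes, independent of the annex code.
\begin{verbatim}
# Hypothetical sketch: estimate H_n / n for NFBP with uniform(0,1) item sizes.
import random

def next_fit_bin_count(n):
    """Number of bins Next Fit opens for n uniform(0,1) items."""
    bins, level = 1, 0.0
    for _ in range(n):
        size = random.random()
        if level + size > 1:   # item does not fit: open a new bin
            bins, level = bins + 1, size
        else:
            level += size
    return bins

runs = 20
for n in (10**2, 10**3, 10**4, 10**5):
    ratio = sum(next_fit_bin_count(n) for _ in range(runs)) / (runs * n)
    print(f"n = {n}: H_n / n is approximately {ratio:.3f}")
\end{verbatim}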
\section{Next Fit Dual Bin Packing algorithm (NFDBP)}
@@ -312,25 +330,25 @@ new constraints on the first bin can be expressed as follows :
\subsection{The big proof}
Let $ k \geq 2 $. Let $ (U_n)_{n \in \mathbb{N}^*} $ be a sequence of
independent random variables with uniform distribution on $ [0, 1] $, representing
the size of the $ n $-th item.
Let $ i \in \mathbb{N} $. $ T_i $ denotes the number of items in the $ i $-th
bin. We have that
\begin{equation*}
T_i = k \iff U_1 + U_2 + \ldots + U_{k-1} < 1 \text{ and } U_1 + U_2 + \ldots + U_{k} \geq 1
\end{equation*}
Let $ A_k = \{ U_1 + U_2 + \ldots + U_{k} < 1 \}$. Hence,
\begin{align}
\label{eq:prob}
P(T_i = k)
& = P(A_{k-1} \cap A_k^c) \\
& = P(A_{k-1}) - P(A_k) \qquad \text{ (as $ A_k \subset A_{k-1} $)} \\
\end{align}
We will try to show that $ \forall k \geq 1 $, $ P(A_k) = \frac{1}{k!} $. To do
so, we will use induction to prove the following proposition \eqref{eq:induction},
@@ -373,23 +391,55 @@ $ f_{U_{k-1}}(y) = 1 $. We can then integrate by parts :
\begin{align*}
P(S_k < a)
& = \int_{0}^{a} f_{S_{k-1}}(x) \cdot \left( \int_{0}^{a-x} 1 dy \right) dx \\
& = \int_{0}^{a} f_{S_{k-1}}(x) \cdot (a - x) dx \\
& = a \int_{0}^{a} f_{S_{k-1}}(x) dx - \int_{0}^{a} x f_{S_{k-1}}(x) dx \\
& = a \int_0^a F'_{S_{k-1}}(x) dx - \left[ x F_{S_{k-1}}(x) \right]_0^a
+ \int_{0}^{a} F_{S_{k-1}}(x) dx \qquad \text{(integration by parts: } x, F_{S_{k-1}} \in C^1([0,1]) \text{)} \\
& = a \left[ F_{S_{k-1}}(x) \right]_0^a - \left[ x F_{S_{k-1}}(x) \right]_0^a
+ \int_{0}^{a} \frac{x^{k-1}}{(k-1)!} dx \\
& = \left[ \frac{x^k}{k!} \right]_0^a \\
& = \frac{a^k}{k!} \\
\end{align*}
\paragraph{Conclusion} We have shown that $ (\mathcal{H}_{k}) $ is true, so by induction, $ \forall k \geq 1 $,
$ \forall a \in [0, 1] $, $ P(U_1 + U_2 + \ldots + U_{k} < a) = \frac{a^k}{k!} $. Take
$ a = 1 $ to get
\[ P(U_1 + U_2 + \ldots + U_{k} < 1) = \frac{1}{k!} \]
Finally, plugging this into \eqref{eq:prob} gives us
\[
P(T_i = k) = P(A_{k-1}) - P(A_{k}) = \frac{1}{(k-1)!} - \frac{1}{k!} \qquad \forall k \geq 2
\]
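As a sanity check, this probability mass function can be compared with a direct simulation
of a single NFDBP bin (draw uniform sizes until their sum reaches $ 1 $ and count how many
were needed). The snippet below is a hypothetical sketch, separate from the annex code.
\begin{verbatim}
# Hypothetical sketch: empirical P(T = k) versus the closed form 1/(k-1)! - 1/k!.
import random
from math import factorial

def first_bin_items():
    """Draw uniform(0,1) sizes until their sum reaches 1; return the count."""
    total, k = 0.0, 0
    while total < 1:
        total += random.random()
        k += 1
    return k

R = 10**5
counts = {}
for _ in range(R):
    k = first_bin_items()
    counts[k] = counts.get(k, 0) + 1

for k in range(2, 7):
    empirical = counts.get(k, 0) / R
    theory = 1 / factorial(k - 1) - 1 / factorial(k)
    print(f"k = {k}: empirical {empirical:.4f}, theory {theory:.4f}")
\end{verbatim}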
\subsection{Expected value of $ T_i $}
We now compute the expected value $ \mu $ and variance $ \sigma^2 $ of $ T_i $.
\begin{align*}
\mu = E(T_i) & = \sum_{k=2}^{\infty} k \cdot P(T_i = k) \\
& = \sum_{k=2}^{\infty} (\frac{k}{(k-1)!} - \frac{1}{(k-1)!}) \\
& = \sum_{k=2}^{\infty} \frac{k-1}{(k-1)!} \\
& = \sum_{k=0}^{\infty} \frac{1}{k!} \\
& = e \\
\end{align*}
\begin{align*}
E({T_i}^2) & = \sum_{k=2}^{\infty} k^2 \cdot P(T_i = k) \\
& = \sum_{k=2}^{\infty} (\frac{k^2}{(k-1)!} - \frac{k}{(k-1)!}) \\
& = \sum_{k=2}^{\infty} \frac{(k-1)k}{(k-1)!} \\
& = \sum_{k=2}^{\infty} \frac{k}{(k-2)!} \\
& = \sum_{k=0}^{\infty} \frac{k+2}{k!} \\
& = \sum_{k=1}^{\infty} \frac{1}{(k-1)!} + \sum_{k=0}^{\infty} \frac{2}{k!} \\
& = e + 2e \\
& = 3e
\end{align*}
\begin{align*}
\sigma^2 = E({T_i}^2) - E(T_i)^2 = 3e - e^2 \approx 0.77
\end{align*}
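These closed forms are easy to double-check numerically by truncating the series; the short
standalone snippet below (not part of the annex code) prints values close to $ e \approx
2.718 $, $ 3e \approx 8.155 $ and $ 3e - e^2 \approx 0.766 $.
\begin{verbatim}
# Truncated-series check of the moments of T_i.
from math import factorial

def p(k):
    # P(T_i = k) = 1/(k-1)! - 1/k! for k >= 2
    return 1 / factorial(k - 1) - 1 / factorial(k)

mean = sum(k * p(k) for k in range(2, 40))
second_moment = sum(k ** 2 * p(k) for k in range(2, 40))

print(mean)                       # close to e
print(second_moment)              # close to 3e
print(second_moment - mean ** 2)  # variance, close to 3e - e^2
\end{verbatim}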
\section{Complexity and implementation optimization}
@@ -473,7 +523,10 @@ then calculate the statistics (which iterates multiple times over the array).
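One possible way to avoid iterating several times over the array of results is to accumulate
the statistics in a single pass with Welford's online algorithm; the helper below is a hedged
sketch of that idea, not the report's implementation.
\begin{verbatim}
# Sketch of single-pass (Welford) mean and unbiased sample variance.
def running_stats(values):
    n, mean, m2 = 0, 0.0, 0.0
    for x in values:
        n += 1
        delta = x - mean
        mean += delta / n
        m2 += delta * (x - mean)
    variance = m2 / (n - 1) if n > 1 else 0.0
    return mean, variance
\end{verbatim}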
\subsection{Optimal algorithm}
\cite{bin-packing-approximation:2022}
\sectionnn{Conclusion}
\nocite{bin-packing-approximation:2022}
\nocite{hofri:1987}

Binary file not shown (new image, 12 KiB)

Binary file not shown (new image, 12 KiB)

Binary file not shown (new image, 21 KiB)

Binary file not shown (new image, 21 KiB)

View file

@@ -20,7 +20,7 @@
%\begin{center}
% -\hspace{0.25cm}Version of \today\hspace{0.25cm}-
%\end{center}
Defense on June 7th 2023
}
\def\varinsaaddress{