Commit 98e75cff authored by even

Article: equations revisited

parent 6e377eed
@@ -6,7 +6,7 @@
\SetKwData{lm}{LocMax}
\SetKwData{nullset}{$\emptyset$}
\SetKwData{ortho}{$\vec{AB}_\perp$}
\SetKwData{eps}{$\varepsilon_{ini}$}
\SetKwData{eps}{$2~\varepsilon_{ini}$}
\SetKwData{pta}{$A$}
\SetKwData{ptb}{$B$}
\SetKwData{Result}{Result}
@@ -28,7 +28,7 @@ The present work aims at designing a flexible tool to detect blurred segments
with optimal width and orientation in gray-level images, in both
supervised and unsupervised contexts.
User-friendly solutions are sought, ideally with no parameter to set,
or at least very few values with an intuitive meaning to an end user.
or at least very few values with an intuitive meaning.
\subsection{Previous work}
@@ -154,15 +154,15 @@ when the orientation is badly estimated (\RefFig{fig:escape} c).
\includegraphics[width=0.48\textwidth]{Fig_notions/escapeFirst_zoom.png} &
\includegraphics[width=0.48\textwidth]{Fig_notions/escapeSecond_zoom.png} \\
\multicolumn{2}{c}{
\includegraphics[width=0.78\textwidth]{Fig_notions/escapeThird_zoom.png}}
\includegraphics[width=0.72\textwidth]{Fig_notions/escapeThird_zoom.png}}
\begin{picture}(1,1)(0,0)
{\color{dwhite}{
\put(-260,108.5){\circle*{8}}
\put(-86,108.5){\circle*{8}}
\put(-260,100.5){\circle*{8}}
\put(-86,100.5){\circle*{8}}
\put(-172,7.5){\circle*{8}}
}}
\put(-263,106){a}
\put(-89,106){b}
\put(-263,98){a}
\put(-89,98){b}
\put(-175,5){c}
\end{picture}
\end{tabular}
@@ -282,9 +282,10 @@ First the positions $M_j$ of the prominent local maxima of the gradient
magnitude found under the stroke are sorted from the highest to the lowest.
For each of them the main detection process is run with three modifications:
\begin{enumerate}
\item the initial detection takes $M_j$ and the orthogonal direction $AB_\perp$
to the stroke as input to build a static scan of fixed width
$\varepsilon_{ini}$, and $M_j$ is used as the start point of the blurred segment;
\item the initial detection takes $M_j$ and the orthogonal direction
$\vec{AB}_\perp$ to the stroke as input to build a static scan of fixed width
$2~\varepsilon_{ini}$, and $M_j$ is used as the start point of the blurred
segment;
\item the occupancy mask is filled in with the points of the detected blurred
segments $\mathcal{B}_j''$ at the end of each successful detection;
\item points marked as occupied are rejected when selecting candidates for the
@@ -357,21 +358,36 @@ to collect all the segments found under the stroke.
\input{Fig_method/algoAuto}
The performance of the detector is illustrated in \RefFig{fig:evalAuto}b
or in \RefFig{fig:noisy} where hardly perceptible edges are detected in this
quite textured image. When the initial value of the assigned width is small,
short edges are detected. Longer edges are detected if the initial
assigned width is larger, but the found segments incorporate a lot of
interfering outliers.
\RefFig{fig:evalAuto}b gives an idea of the automatic detection performance.
In the example of \RefFig{fig:noisy}, hardly perceptible edges are detected
despite the quite textured context.
Unsurprisingly, the length of the detected edges is linked to the initial
value of the assigned width, but a large value also increases the number
of interfering outliers inserted into the segments.
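
As a reading aid, the following minimal sketch (Python, not the authors'
implementation) summarizes the multi-detection loop described above: the
gradient maxima $M_j$ are processed from highest to lowest, each one seeds a
detection in a static scan of fixed width $2~\varepsilon_{ini}$ directed along
$\vec{AB}_\perp$, and an occupancy mask prevents reusing points of previously
detected segments. All names and parameters here (maxima, ab_orthogonal,
eps_ini, detect, reject) are illustrative assumptions, not the paper's API.

def multi_detection(maxima, ab_orthogonal, eps_ini, detect):
    # 'maxima' is assumed sorted from the highest to the lowest gradient
    # magnitude; 'detect' is assumed to run the main detection process from
    # a seed point, a scan direction and a fixed scan width, returning the
    # points of the detected blurred segment (or None on failure).
    occupancy = set()          # occupancy mask: points already assigned
    segments = []
    for m in maxima:
        if m in occupancy:     # occupied points are rejected as candidates
            continue
        # initial detection: static scan of fixed width 2 * eps_ini,
        # directed along AB_perp and started at M_j
        points = detect(m, ab_orthogonal, 2 * eps_ini, reject=occupancy)
        if points:
            segments.append(points)
            occupancy.update(points)   # fill in the occupancy mask
    return segments
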
\begin{figure}[h]
\center
\begin{tabular}{c@{\hspace{0.2cm}}c@{\hspace{0.2cm}}c}
\begin{tabular}{c@{\hspace{0.1cm}}c@{\hspace{0.1cm}}c}
\includegraphics[width=0.32\textwidth]{Fig_method/parpaings.png} &
\includegraphics[width=0.32\textwidth]{Fig_method/parpaings2.png} &
\includegraphics[width=0.32\textwidth]{Fig_method/parpaings3.png}
\end{tabular}
\caption{Automatic detection of blurred segments on a quite textured image.}
\begin{picture}(1,1)(0,0)
{\color{dwhite}{
\put(-286,-25.5){\circle*{8}}
\put(-171,-25.5){\circle*{8}}
\put(-58,-25.5){\circle*{8}}
}}
\put(-288.5,-28){a}
\put(-173.5,-28){b}
\put(-60.5,-28){c}
\end{picture}
\caption{Automatic detection of blurred segments on a textured image.
a) the input image,
b) automatic detection result with initial assigned width set
to 3 pixels,
c) automatic detection result with initial assigned width set
to 8 pixels.}
\label{fig:noisy}
\end{figure}
@@ -121,21 +121,20 @@ At each iteration $i$, the scans $S_i$ and $S_{-i}$ are successively processed.
A directional scan can be defined by its start scan $S_0$.
If $A(x_A,y_A)$ and $B(x_B,y_B)$ are the end points of $S_0$,
the scan strip is defined by :
and if we note $\delta_x = x_B - x_A$, $\delta_y = y_B - y_A$,
$c_1 = \delta_x\cdot x_A + \delta_y\cdot y_A$,
$c_2 = \delta_x\cdot x_B + \delta_y\cdot y_B$ and
$\nu_{AB} = max (|\delta_x|, |\delta_y|)$, it is then defined by
the following scan strip $\mathcal{D}^{A,B}$ and scan lines
$\mathcal{N}_i^{A,B}$:
\begin{equation}
\mathcal{D}(A,B) =
\mathcal{L}(\delta_x,~ \delta_y,~ min (c_1,c_2),~ 1 + |c_1-c_2|)
\end{equation}
\noindent
where $\delta_x = x_B - x_A$, $\delta_y = y_B - y_A$,
$c_1 = \delta_x\cdot x_A + \delta_y\cdot y_A$ and
$c_2 = \delta_x\cdot x_B + \delta_y\cdot y_B$.
The scan line $\mathcal{N}_i$ is then defined by :
\begin{equation}
\mathcal{N}_i(A,B) = \mathcal{L}(\delta_y,~ -\delta_x,~
\left\{ \begin{array}{l}
\mathcal{D}^{A,B} =
\mathcal{L}(\delta_x,~ \delta_y,~ min (c_1,c_2),~ 1 + |c_1-c_2|) \\
\mathcal{N}_i^{A,B} = \mathcal{L}(\delta_y,~ -\delta_x,~
\delta_y\cdot x_A - \delta_x\cdot y_A + i\cdot \nu_{AB},~ \nu_{AB})
\end{array} \right.
\end{equation}
where $\nu_{AB} = max (|\delta_x|, |\delta_y|)$
%The scan lines length is $d_\infty(AB)$ or $d_\infty(AB)-1$, where $d_\infty$
%is the chessboard distance ($d_\infty = max (|d_x|,|d_y|)$).
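
To make the notation concrete, here is a minimal sketch (Python, not the
authors' code) that computes the $\mathcal{L}$ parameters of the scan strip
and scan lines from the end points of $S_0$, assuming $\mathcal{L}(a,b,c,w)$
denotes the set of points $(x,y)$ with $c \leq ax + by < c + w$.

def scan_from_end_points(xa, ya, xb, yb):
    # Parameters of D^{A,B} and N_i^{A,B} from A(xa, ya) and B(xb, yb).
    dx, dy = xb - xa, yb - ya
    c1 = dx * xa + dy * ya
    c2 = dx * xb + dy * yb
    nu = max(abs(dx), abs(dy))
    strip = (dx, dy, min(c1, c2), 1 + abs(c1 - c2))    # D^{A,B}
    def line(i):                                       # N_i^{A,B}
        return (dy, -dx, dy * xa - dx * ya + i * nu, nu)
    return strip, line

# Example: A(0,0) and B(8,5) give D = L(8, 5, 0, 90) and N_i = L(5, -8, 8*i, 8).
strip, line = scan_from_end_points(0, 0, 8, 5)
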
@@ -143,15 +142,16 @@
%as the image bounds should also be processed anyway.
A directional scan can also be defined by its central point $C(x_C,y_C)$,
its direction $\vec{D}(X_D,Y_D)$ and its width $w$. The scan strip is :
\begin{equation}
\mathcal{D}(C,\vec{D},w)
= \mathcal{L}(Y_D,~ -X_D,~ x_C\cdot Y_D - y_C\cdot X_D - w / 2,~ w)
\end{equation}
\noindent
and the scan line $\mathcal{N}_i(C,\vec{D},w)$ :
its direction $\vec{D}(X_D,Y_D)$ and its width $w$. If we note
$c_3 = x_C\cdot Y_D - y_C\cdot X_D$ and
$c_4 = X_D\cdot x_C + Y_D\cdot y_C$, it is then defined by
the following scan strip $\mathcal{D}^{C,\vec{D},w}$ and scan lines
$\mathcal{N}_i^{C,\vec{D},w}$:
\begin{equation}
\mathcal{N}_i(C,\vec{D},w) = \mathcal{L}(X_D,~ Y_D,~
X_D\cdot x_C + Y_D\cdot y_C - w / 2 + i\cdot w,~ max (|X_D|,|Y_D|))
\left\{ \begin{array}{l}
\mathcal{D}^{C,\vec{D},w}
= \mathcal{L}(Y_D,~ -X_D,~ c_3 - w / 2,~ w) \\
\mathcal{N}_i^{C,\vec{D},w} = \mathcal{L}(X_D,~ Y_D,~
c_4 - w / 2 + i\cdot w,~ max (|X_D|,|Y_D|))
\end{array} \right.
\end{equation}
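
The same kind of sketch applies to this second parameterization, under the
same assumption on $\mathcal{L}(a,b,c,w)$; the integer handling of $w/2$ is
left aside here.

def scan_from_center(xc, yc, xd, yd, w):
    # Parameters of D^{C,D,w} and N_i^{C,D,w} from C(xc, yc), D(xd, yd) and w.
    c3 = xc * yd - yc * xd
    c4 = xd * xc + yd * yc
    strip = (yd, -xd, c3 - w / 2, w)                   # D^{C,D,w}
    def line(i):                                       # N_i^{C,D,w}
        return (xd, yd, c4 - w / 2 + i * w, max(abs(xd), abs(yd)))
    return strip, line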