Commit 98e75cff authored by even

Article: equations revisited

parent 6e377eed
@@ -6,7 +6,7 @@
\SetKwData{lm}{LocMax}
\SetKwData{nullset}{$\emptyset$}
\SetKwData{ortho}{$\vec{AB}_\perp$}
\SetKwData{eps}{$2~\varepsilon_{ini}$}
\SetKwData{pta}{$A$}
\SetKwData{ptb}{$B$}
\SetKwData{Result}{Result}
...
@@ -28,7 +28,7 @@ The present work aims at designing a flexible tool to detect blurred segments
with optimal width and orientation in gray-level images, for supervised
as well as unsupervised contexts.
User-friendly solutions are sought, ideally with no parameter to set,
or at least only a few values with intuitive meaning.
\subsection{Previous work}
...
@@ -154,15 +154,15 @@ when the orientation is badly estimated (\RefFig{fig:escape} c).
\includegraphics[width=0.48\textwidth]{Fig_notions/escapeFirst_zoom.png} &
\includegraphics[width=0.48\textwidth]{Fig_notions/escapeSecond_zoom.png} \\
\multicolumn{2}{c}{
\includegraphics[width=0.72\textwidth]{Fig_notions/escapeThird_zoom.png}}
\begin{picture}(1,1)(0,0)
{\color{dwhite}{
\put(-260,100.5){\circle*{8}}
\put(-86,100.5){\circle*{8}}
\put(-172,7.5){\circle*{8}}
}}
\put(-263,98){a}
\put(-89,98){b}
\put(-175,5){c}
\end{picture}
\end{tabular}
@@ -282,9 +282,10 @@ First the positions $M_j$ of the prominent local maxima of the gradient
magnitude found under the stroke are sorted from the highest to the lowest.
For each of them the main detection process is run with three modifications:
\begin{enumerate}
\item the initial detection takes $M_j$ and the orthogonal direction
$\vec{AB}_\perp$ to the stroke as input to build a static scan of fixed width
$2~\varepsilon_{ini}$, and $M_j$ is used as the start point of the blurred
segment;
\item the occupancy mask is filled in with the points of the detected blurred
segments $\mathcal{B}_j''$ at the end of each successful detection;
\item points marked as occupied are rejected when selecting candidates for the
@@ -357,21 +358,36 @@ to collect all the segments found under the stroke.
\input{Fig_method/algoAuto}
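The three modifications listed above fit a simple control loop around the main
detection process. A minimal sketch, assuming a hypothetical
`detect(start, direction, width)` routine standing in for that process (all
names are illustrative, not from the article):

```python
def detect_all(local_maxima, stroke_dir_orth, eps_ini, detect):
    """Run the main detection process from each prominent local maximum
    M_j of the gradient magnitude, assumed sorted from highest to lowest."""
    occupancy = set()   # occupancy mask: points of already detected segments
    segments = []
    for m_j in local_maxima:
        # modification 3: occupied points are rejected as candidates
        if m_j in occupancy:
            continue
        # modification 1: static scan of fixed width 2 * eps_ini in the
        # direction orthogonal to the stroke, with M_j as start point
        segment = detect(start=m_j, direction=stroke_dir_orth,
                         width=2 * eps_ini)
        if segment:
            segments.append(segment)
            # modification 2: fill the occupancy mask with the points
            # of the detected blurred segment
            occupancy.update(segment)
    return segments
```

A maximum that already lies on a previously detected segment is thus skipped,
which prevents the same edge from being extracted twice.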
\RefFig{fig:evalAuto}b gives an idea of the automatic detection performance.
In the example of \RefFig{fig:noisy}, hardly perceptible edges are detected
despite the quite textured context.
Unsurprisingly, the length of the detected edges is linked to the initial
value of the assigned width, but a large value also increases the rate
of inserted interfering outliers.
\begin{figure}[h]
\center
\begin{tabular}{c@{\hspace{0.1cm}}c@{\hspace{0.1cm}}c}
\includegraphics[width=0.32\textwidth]{Fig_method/parpaings.png} &
\includegraphics[width=0.32\textwidth]{Fig_method/parpaings2.png} &
\includegraphics[width=0.32\textwidth]{Fig_method/parpaings3.png}
\end{tabular}
\begin{picture}(1,1)(0,0)
{\color{dwhite}{
\put(-286,-25.5){\circle*{8}}
\put(-171,-25.5){\circle*{8}}
\put(-58,-25.5){\circle*{8}}
}}
\put(-288.5,-28){a}
\put(-173.5,-28){b}
\put(-60.5,-28){c}
\end{picture}
\caption{Automatic detection of blurred segments on a textured image.
a) the input image,
b) automatic detection result with initial assigned width set
to 3 pixels,
c) automatic detection result with initial assigned width set
to 8 pixels.}
\label{fig:noisy}
\end{figure}
...
@@ -121,21 +121,20 @@ At each iteration $i$, the scans $S_i$ and $S_{-i}$ are successively processed.
A directional scan can be defined by its start scan $S_0$.
If $A(x_A,y_A)$ and $B(x_B,y_B)$ are the end points of $S_0$,
and if we note $\delta_x = x_B - x_A$, $\delta_y = y_B - y_A$,
$c_1 = \delta_x\cdot x_A + \delta_y\cdot y_A$,
$c_2 = \delta_x\cdot x_B + \delta_y\cdot y_B$ and
$\nu_{AB} = \max (|\delta_x|, |\delta_y|)$, it is then defined by
the following scan strip $\mathcal{D}^{A,B}$ and scan lines
$\mathcal{N}_i^{A,B}$:
\begin{equation}
\left\{ \begin{array}{l}
\mathcal{D}^{A,B} =
\mathcal{L}(\delta_x,~ \delta_y,~ \min (c_1,c_2),~ 1 + |c_1-c_2|) \\
\mathcal{N}_i^{A,B} = \mathcal{L}(\delta_y,~ -\delta_x,~
\delta_y\cdot x_A - \delta_x\cdot y_A + i\cdot \nu_{AB},~ \nu_{AB})
\end{array} \right.
\end{equation}
%The scan lines length is $d_\infty(AB)$ or $d_\infty(AB)-1$, where $d_\infty$
%is the chessboard distance ($d_\infty = \max (|d_x|,|d_y|)$).
@@ -143,15 +142,16 @@
%as the image bounds should also be processed anyway.
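Here $\mathcal{L}(a, b, c, \nu)$ denotes a digital straight line, assumed for
this sketch to follow the usual convention
$\mathcal{L}(a,b,c,\nu) = \{(x,y) \in \mathbb{Z}^2 \mid c \le ax + by < c + \nu\}$
(the definition is not restated in this excerpt). Under that assumption, the
strip and scan-line parameters of the end-point form can be computed as:

```python
def in_digital_line(p, a, b, c, nu):
    """Membership test for L(a, b, c, nu), assumed to be
    {(x, y) : c <= a*x + b*y < c + nu}."""
    x, y = p
    return c <= a * x + b * y < c + nu

def scan_strip_AB(A, B):
    """Parameters (a, b, c, nu) of the scan strip D^{A,B}."""
    dx, dy = B[0] - A[0], B[1] - A[1]
    c1 = dx * A[0] + dy * A[1]
    c2 = dx * B[0] + dy * B[1]
    return dx, dy, min(c1, c2), 1 + abs(c1 - c2)

def scan_line_AB(A, B, i):
    """Parameters (a, b, c, nu) of the scan line N_i^{A,B}."""
    dx, dy = B[0] - A[0], B[1] - A[1]
    nu = max(abs(dx), abs(dy))
    return dy, -dx, dy * A[0] - dx * A[1] + i * nu, nu
```

By construction both end points belong to the strip: their values $c_1$ and
$c_2$ lie in $[\min(c_1,c_2),\ \min(c_1,c_2) + |c_1-c_2|]$.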
A directional scan can also be defined by its central point $C(x_C,y_C)$,
its direction $\vec{D}(X_D,Y_D)$ and its width $w$. If we note
$c_3 = x_C\cdot Y_D - y_C\cdot X_D$ and
$c_4 = X_D\cdot x_C + Y_D\cdot y_C$, it is then defined by
the following scan strip $\mathcal{D}^{C,\vec{D},w}$ and scan lines
$\mathcal{N}_i^{C,\vec{D},w}$:
\begin{equation}
\left\{ \begin{array}{l}
\mathcal{D}^{C,\vec{D},w}
= \mathcal{L}(Y_D,~ -X_D,~ c_3 - w / 2,~ w) \\
\mathcal{N}_i^{C,\vec{D},w} = \mathcal{L}(X_D,~ Y_D,~
c_4 - w / 2 + i\cdot w,~ \max (|X_D|,|Y_D|))
\end{array} \right.
\end{equation}