Difference: WebcamDetector (1 vs. 37)

Revision 37 (2015-09-07) - CandanDozen

Line: 1 to 1
 
META TOPICPARENT name="MajorTasks"
Project Proposal
Proposal video
Line: 458 to 458
 

SHIFT INSTRUCTION

Changed:
<
<
Login as a daquser to pcbl4sle04g pc : ssh -Y daquser@pxbl4sleo4g.cern.ch / Password: BeamLine15
>
>
Login as a daquser to pcbl4sle04g pc : ssh -Y daquser@pcbl4sleo4g.cern.ch (Password: BeamLine15)
 

1.BACKGROUND Data taking with webcam:

Revision 36 (2015-09-07) - CandanDozen

Line: 1 to 1
 
META TOPICPARENT name="MajorTasks"
Project Proposal
Proposal video
Line: 344 to 344
  The program "BackgroundCreator.cpp" produce an output background image with "*.tif " extension by taking average of the all frames which are taken from webcam as background.
Changed:
<
<
cd /afs/cern.ch/user/d/daquser/public/Webcam/aug28/Analysis/BackgroundCreator
>
>
cd /afs/cern.ch/user/d/daquser/public/Webcam/Image_Analyis_withOpenCV/opencv/Analysis/BackgroundCreator
  To compile it (if you have made changes to the source code):

Line: 379 to 379
 "Substract.cpp" program has been developed to have a clear signal by subtracting calibration frame from each beam frames as it mentioned in previous section. it also count the number of pixels for each frames. We can analyze the subtracting images with "*Analysis.cpp*" program. This program gives us information about the number of pixels with the saturation value from 0 (black) to 255 (white) within .txt file.
Changed:
<
<
cd /afs/cern.ch/user/d/daquser/public/Webcam/aug28/Analysis
>
>
cd /afs/cern.ch/user/d/daquser/public/Webcam/Image_Analyis_withOpenCV/opencv/Analysis
  Both programs can be compiled in the same way if you have made any changes to the code:

Line: 416 to 416
 

Test with Radioactive Source

Changed:
<
<
Strontium radioctive source was used:
>
>
Strontium radioactive source was used:
 In order to estimate the background, several images were taken with the webcam without any source; afterwards the experiment was repeated with the radioactive source.

Example background image:

Line: 425 to 425
  Analyze the background image with the BG_Measurement.cpp program
Changed:
<
<
cd /afs/cern.ch/user/d/daquser/public/Webcam/Analysis
>
>
cd /afs/cern.ch/user/d/daquser/public/Webcam/Image_Analyis_withOpenCV/opencv/Analysis
  Try several threshold values to find the ideal threshold.
Line: 442 to 442
 
$name
Changed:
<
<
1- The first Timepix was removed. Radioactive source was put in front of the Webcam.
>
>
1.First Timepix was removed. Radioactive source was put in front of the Webcam.
  Example image taken with the radioactive source:


$name

Added:
>
>

Hardware

Dimensions

  The active area of the sensor appears to be 4mm x 3mm. Given the maximum resolution of 1280x960 pixels, the pixels are 3.125 microns square.
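As a quick check of that pixel pitch: $4\,\mathrm{mm} / 1280 = 3\,\mathrm{mm} / 960 = 3.125\,\mathrm{\mu m}$.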
Line: 456 to 458
 

SHIFT INSTRUCTION

Changed:
<
<
Login as a daquser to bl4sdaq1 pc : ssh -Y daquser@bl4sdaq1.cern.ch / Password: BeamLine15
>
>
Login as a daquser to pcbl4sle04g pc : ssh -Y daquser@pxbl4sleo4g.cern.ch / Password: BeamLine15
 
Changed:
<
<
1.BACKGROUND DataTaking with webcam:
>
>
1.BACKGROUND Data taking with webcam:
  Flow chart:
Line: 521 to 523
  cd /afs/cern.ch/user/d/daquser/public/Webcam/Trigger/output_backgorund

Changed:
<
<
You will see at least 10000 frames will be saved.
>
>
You need to save at least 1000 frames for background.
  2.DATA Taking :
Line: 563 to 565
 You don't need to change any settings. All settings must be left at their defaults.

Changed:
<
<

Hardware

Dimensions

>
>
 

Contacts

Revision 35 (2015-09-07) - CandanDozen

Line: 1 to 1
 
META TOPICPARENT name="MajorTasks"
Project Proposal
Proposal video
Line: 459 to 459
 Login as a daquser to bl4sdaq1 pc : ssh -Y daquser@bl4sdaq1.cern.ch / Password: BeamLine15
Changed:
<
<
1.*Background Data Taking:*
>
>
1.BACKGROUND DataTaking with webcam:

Flow chart:


$name

  Go to "Trigger" directory:

cd /afs/cern.ch/user/d/daquser/public/Webcam/Trigger

Changed:
<
<
Create a new folder to write the background output files in it : (example: mkdir output_backgorund)
>
>
Create a new folder to write the background output files in it

mkdir output_backgorund

  Define the name of the output folder in Trigger.cpp code:
Line: 479 to 485
  To execute the Trigger.cpp :
Added:
>
>
STAGE 1:

  ./run.sh
Changed:
<
<
Flow chart:
>
>
STAGE 2:
 
Added:
>
>
After starting the bash script (./run.sh), two windows will appear on the screen. Look at the figure below.
 
Changed:
<
<

$name
>
>
Title of window 1 (W1): Trigger (on ) Title of window 2 (W2) : Webcam (on )


$name

The purpose of W1 is to control the settings of the WebCam.

STAGE 3:

You can change the threshold settings either by moving the sliders on the track bar (W1) or directly in the Trigger.cpp program. In the code:

int pixelValueThreshold = 150; /// [changeable] each pixel value must be greater than 150 in order to select active (white) pixels --> we call these "Dead Pixels"
int numOfPixelThreshold = 50; /// [changeable] at least 50 pixels must have a value greater than the pixel value threshold of 150.
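For orientation, a hedged sketch of the trigger cut that these two parameters control; this is not the actual Trigger.cpp source (the real program also manages the track bar, exposure and output settings), and the output folder name is taken from the sprintf line shown elsewhere on this page:

###############################################################################################################
// Hedged sketch of the trigger cut (not the actual Trigger.cpp): a frame is
// written out only if at least numOfPixelThreshold pixels are brighter than
// pixelValueThreshold.
#include <opencv2/opencv.hpp>
#include <cstdio>

int main()
{
    int pixelValueThreshold = 150;   // [changeable] minimum saturation for an active (white) pixel
    int numOfPixelThreshold = 50;    // [changeable] minimum number of active pixels needed to save a frame

    cv::VideoCapture cam(0);         // the webcam (device 0 assumed)
    if (!cam.isOpened()) return 1;

    for (int frameNumber = 0; ; ++frameNumber) {
        cv::Mat frame, gray, active;
        if (!cam.read(frame)) break;
        cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);
        cv::threshold(gray, active, pixelValueThreshold, 255, cv::THRESH_BINARY);
        int numActive = cv::countNonZero(active);
        if (numActive >= numOfPixelThreshold) {   // the trigger cut
            char fileName[64];
            std::snprintf(fileName, sizeof(fileName),
                          "output_background/frame_%d.tif", frameNumber);
            cv::imwrite(fileName, gray);
            std::printf("saved %s (%d active pixels)\n", fileName, numActive);
        }
    }
    return 0;
}
###############################################################################################################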

STAGE 4:

 
Added:
>
>
If the threshold setting is OK (Stage 3), you will see the captured images being saved, with console output as follows:
 
Changed:
<
<
2.Data Taking:
>
>

$name ===> console image still needs to be attached
 
Changed:
<
<
Follow the all steps above.
>
>
STAGE 5:
 
Changed:
<
<
Only change the folder name to write the data output files in it. (example: mkdir output_data)
>
>
Go to the output_background folder :
 
Changed:
<
<
Execute the Trigger.cpp program again and run the bash script:
>
>
cd /afs/cern.ch/user/d/daquser/public/Webcam/Trigger/output_backgorund

You will see at least 10000 frames will be saved.

2.DATA Taking :

Flow Chart for data taking


$name

Note: you only need to create a new folder for data taking in the Trigger directory, as we did for the background data taking (example: mkdir output_data).

Execute the "Trigger.cpp" program again and run the bash script:

  cmake .

make

Added:
>
>
STAGE 1: Start Trigger:
  ./run.sh
Changed:
<
<
Flow Chart for data taking
>
>
Repeat all steps as we have done for Background Data Taking.
 
Changed:
<
<

$name
>
>
Start Timepix:

Go to "pixelman" directory:

cd /afs/cern.ch/user/d/daquser/public/Webcam/pixelman/Pixelman_2013_09_25_x64/

Execute "pixelman"

./pixelman.sh

The Timepix Control Panel will appear:

 
Added:
>
>
Change the folder name to save your output files. You don't need to change any other settings; all settings must be left at their defaults.
 

Hardware

Revision 34 (2015-09-07) - CandanDozen

Line: 1 to 1
 
META TOPICPARENT name="MajorTasks"
Project Proposal
Proposal video
Line: 449 to 449
 
$name
Added:
>
>
The active area of the sensor appears to be 4mm x 3mm. Given the maximum resolution of 1280x960 pixels, the pixels are 3.125 microns square.

Mounting

To give flexibility with positioning, the plan is to mount the webcam rigidly between two TimePix sensors. All three can be aligned together and moved, if necessary, without disturbing the alignment.
 

SHIFT INSTRUCTION

Changed:
<
<
1.Background Data Taking
>
>
Login as a daquser to bl4sdaq1 pc : ssh -Y daquser@bl4sdaq1.cern.ch / Password: BeamLine15

1.*Background Data Taking:*

Go to "Trigger" directory:

cd /afs/cern.ch/user/d/daquser/public/Webcam/Trigger

Create a new folder to write the background output files in it : (example: mkdir output_backgorund)

Define the name of the output folder in Trigger.cpp code:

sprintf(fileName, "output_background/frame_%d.tif", frameNumber);
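/// note: this folder name must match the folder actually created in the previous step (the mkdir example on this page uses "output_backgorund")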

After making any changes in the code, we need to compile it:

cmake .

make

To execute the Trigger.cpp :

./run.sh

Flow chart:

 


$name

Added:
>
>
2.Data Taking:

Follow all the steps above.

Only change the folder name to write the data output files in it. (example: mkdir output_data)

Execute the Trigger.cpp program again and run the bash script:

cmake .

make

./run.sh

Flow Chart for data taking


$name

 

Hardware

Dimensions

Deleted:
<
<
The active area of the sensor appears to be 4mm x 3mm. Given the maximum resolution of 1280x960 pixels, the pixels are 3.125 microns square.

Mounting

To give flexibility with positioning, the plan is to mount the webcam rigidly between two TimePix sensors. All three can be aligned together and moved, if necessary, without disturbing the alignment.
 

Contacts

Line: 503 to 551
 
META FILEATTACHMENT attachment="BG_output.txt" attr="" comment="Contains information about BG images" date="1441151513" name="BG_output.txt" path="BG_output.txt" size="28735" user="cdozen" version="1"
META FILEATTACHMENT attachment="BG_without_source.png" attr="" comment="Histogram of BG image" date="1441152165" name="BG_without_source.png" path="BG_without_source.png" size="23554" user="cdozen" version="1"
META FILEATTACHMENT attachment="Background_Taking_Chart.png" attr="" comment="" date="1441573718" name="Background_Taking_Chart.png" path="Background_Taking_Chart.png" size="124834" user="cdozen" version="1"
Added:
>
>
META FILEATTACHMENT attachment="Data_Taking_Chart.png" attr="" comment="" date="1441582017" name="Data_Taking_Chart.png" path="Data_Taking_Chart.png" size="91194" user="cdozen" version="1"

Revision 33 (2015-09-06) - CandanDozen

Line: 1 to 1
 
META TOPICPARENT name="MajorTasks"
Project Proposal
Proposal video
Line: 448 to 448
 
$name
Added:
>
>

SHIFT INSTRUCTION

1.Background Data Taking


$name

 

Hardware

Dimensions

The active area of the sensor appears to be 4mm x 3mm. Given the maximum resolution of 1280x960 pixels, the pixels are 3.125 microns square.
Line: 493 to 502
 
META FILEATTACHMENT attachment="background.png" attr="" comment="" date="1441144429" name="background.png" path="background.png" size="236409" user="cdozen" version="1"
META FILEATTACHMENT attachment="BG_output.txt" attr="" comment="Contains information about BG images" date="1441151513" name="BG_output.txt" path="BG_output.txt" size="28735" user="cdozen" version="1"
META FILEATTACHMENT attachment="BG_without_source.png" attr="" comment="Histogram of BG image" date="1441152165" name="BG_without_source.png" path="BG_without_source.png" size="23554" user="cdozen" version="1"
Added:
>
>
META FILEATTACHMENT attachment="Background_Taking_Chart.png" attr="" comment="" date="1441573718" name="Background_Taking_Chart.png" path="Background_Taking_Chart.png" size="124834" user="cdozen" version="1"

Revision 32 (2015-09-06) - CandanDozen

Line: 1 to 1
 
META TOPICPARENT name="MajorTasks"
Project Proposal
Proposal video
Line: 266 to 266
  Leave it running for a while. Images (at 1 fps, or whatever exposure time you have set) will be saved with ".tif" extensions in the "output" directory.
Added:
>
>
You can find the "Trigger "program written by opencv is in git repository: link to Git Repository for Triggering webcam with opencv
 Example of background image:


$name

Line: 374 to 376
  Subtraction and Analysis
Changed:
<
<
"*Substract.cpp*" program has been developed to have a clear signal by subtracting calibration frame from each beam frames as it mentioned in previous section. it also count the number of pixels for each frames.
>
>
"Substract.cpp" program has been developed to have a clear signal by subtracting calibration frame from each beam frames as it mentioned in previous section. it also count the number of pixels for each frames.
  We can analyze the subtracted images with the "*Analysis.cpp*" program. This program reports the number of pixels at each saturation value from 0 (black) to 255 (white) in a .txt file.

cd /afs/cern.ch/user/d/daquser/public/Webcam/aug28/Analysis

Line: 410 to 412
 ###############################################################################################################
Added:
>
>
You can find the "Analysis "programs written by opencv is in git repository: link to Git Repository for Image Analysis with opencv
 

Test with Radioactive Source

Changed:
<
<
Strontium radiactive source was used:
>
>
Strontium radioctive source was used:
 In order to estimate background several images were taken from webcam without any source and afterwards the experiment was repeated with using radioactive source.

Example background image:

Revision 31 (2015-09-02) - CandanDozen

Line: 1 to 1
 
META TOPICPARENT name="MajorTasks"
Project Proposal
Proposal video
Line: 262 to 262
 

The area below the sliders shows you the last frame that was captured by the camera. It shows all frames, not only those that are recorded.

Changed:
<
<
The second Window, W2 also shows the current frame with RGB channels.
>
>
The second Window, W2 also shows the current frame but only the frames that pass the trigger cut (threshold and active pixel numbers) will be saved.
  Leave it running for a while . Images (1 fps (or whatever exposure time you have set)) will be saved with ".tif " extensions in the "output" directory.

Revision 30 (2015-09-02) - CandanDozen

Line: 1 to 1
 
META TOPICPARENT name="MajorTasks"
Project Proposal
Proposal video
Line: 419 to 419
 
$name
Added:
>
>
Analyze the background image with the BG_Measurement.cpp program

cd /afs/cern.ch/user/d/daquser/public/Webcam/Analysis

Try several threshold values to find the ideal threshold.

Execute BG_Measurement.exe for 63 files with a threshold value of 25:

./BG_Measurement.exe -n 63 -t 25 > BG_output.txt

"BG_output.txt" contains information about the image, such as dead pixels and their coordinates, pixel values above threshold and their coordinates, and the number of pixels at each saturation value.

* BG_output.txt: Contains information about BG images

According to the information in the BG_output.txt file, the histogram of the background image looks as follows.


$name
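If you prefer a small macro over typing "root -l background.root" and "new TBrowser" by hand, here is a hedged sketch; it assumes the BG_Measurement output file background.root contains a saturation histogram named "hvalue" (as described elsewhere on this page), and the actual histogram names may differ:

###############################################################################################################
// Hedged sketch: inspect the BG_Measurement ROOT output without a TBrowser.
// Run with:  root -l inspect_bg.C
#include "TFile.h"
#include "TH1.h"
#include "TCanvas.h"
#include <iostream>

void inspect_bg()
{
    TFile* f = TFile::Open("background.root");
    if (!f || f->IsZombie()) {
        std::cerr << "could not open background.root\n";
        return;
    }
    TH1* hvalue = nullptr;
    f->GetObject("hvalue", hvalue);     // saturation histogram, 0 (black) .. 255 (white)
    if (!hvalue) {
        f->ls();                        // list what the file really contains
        return;
    }
    TCanvas* c = new TCanvas("c", "BG saturation");
    hvalue->Draw();
    c->SaveAs("BG_saturation.png");     // save a copy in your own data area
}
###############################################################################################################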

 1- The first Timepix was removed. Radioactive source was put in front of the Webcam.

Example image with using radioactive source :

Line: 466 to 487
 
META FILEATTACHMENT attachment="frame_0.png" attr="" comment="" date="1441057380" name="frame_0.png" path="frame_0.png" size="54584" user="cdozen" version="1"
META FILEATTACHMENT attachment="str_webcam.jpg" attr="" comment="" date="1441143264" name="str_webcam.jpg" path="str_webcam.jpg" size="63537" user="cdozen" version="1"
META FILEATTACHMENT attachment="background.png" attr="" comment="" date="1441144429" name="background.png" path="background.png" size="236409" user="cdozen" version="1"
Added:
>
>
META FILEATTACHMENT attachment="BG_output.txt" attr="" comment="Contains information about BG images" date="1441151513" name="BG_output.txt" path="BG_output.txt" size="28735" user="cdozen" version="1"
META FILEATTACHMENT attachment="BG_without_source.png" attr="" comment="Histogram of BG image" date="1441152165" name="BG_without_source.png" path="BG_without_source.png" size="23554" user="cdozen" version="1"

Revision 29 (2015-09-01) - CandanDozen

Line: 1 to 1
 
META TOPICPARENT name="MajorTasks"
Project Proposal
Proposal video
Line: 78 to 75
 

Software

Changed:
<
<
Read webcam and Analyse Images
>
>
Preview webcam with gstreamer:
> gst-launch-0.10 -v -e v4l2src ! timeoverlay ! autovideosink
Debugging can be enabled with an environment variable:
> GST_DEBUG=3 gst-launch-0.10 -v -e v4l2src ! timeoverlay ! autovideosink
Elements are connected with `!` characters, v4l2src supplies webcam data, timeoverlay takes image data and timestamps it, autovideosink creates a X11 window and renders the image data into it. Elements can be interrogated for their configuration, e.g.:
> gst-inspect v4l2src
...
Element Properties:
  name                : The name of the object
                        flags: readable, writable
                        String. Default: null Current: "v4l2src0"
  blocksize           : Size in bytes to read per buffer (-1 = default)
                        flags: readable, writable
                        Unsigned Long. Range: 0 - 18446744073709551615 Default: 4096 Current: 4096
  num-buffers         : Number of buffers to output before sending EOS (-1 = unlimited)
                        flags: readable, writable
                        Integer. Range: -1 - 2147483647 Default: -1 Current: -1
  typefind            : Run typefind before negotiating
                        flags: readable, writable
                        Boolean. Default: false Current: false
  do-timestamp        : Apply current stream time to buffers
                        flags: readable, writable
                        Boolean. Default: false Current: false
  device              : Device location
                        flags: readable, writable
                        String. Default: "/dev/video0" Current: "/dev/video0"
  device-name         : Name of the device
                        flags: readable
                        String. Default: null Current: "UVC Camera (046d:0825)"
  device-fd           : File descriptor of the device
                        flags: readable
                        Integer. Range: -1 - 2147483647 Default: -1 Current: -1
  flags               : Device type flags
                        flags: readable
                        Flags "GstV4l2DeviceTypeFlags" Default: 0x00000000, "(none)" Current: 0x00000000, "(none)"
                           (0x00000001): capture          - Device supports video capture
                           (0x00000002): output           - Device supports video playback
                           (0x00000004): overlay          - Device supports video overlay
                           (0x00000010): vbi-capture      - Device supports the VBI capture
                           (0x00000020): vbi-output       - Device supports the VBI output
                           (0x00010000): tuner            - Device has a tuner or modulator
                           (0x00020000): audio            - Device has audio inputs or outputs
  queue-size          : Number of buffers to be enqueud in the driver in streaming mode
                        flags: readable, writable
                        Unsigned Integer. Range: 1 - 16 Default: 2 Current: 2
  always-copy         : If the buffer will or not be used directly from mmap
                        flags: readable, writable
                        Boolean. Default: true Current: true

The camera format can be set like so:

> v4l2-ctl --set-fmt-video=width=1280,height=960,pixelformat=0

The pixel formats are listed with:

> v4l2-ctl --list-formats-ext
ioctl: VIDIOC_ENUM_FMT
   Index       : 0
   Type        : Video Capture
   Pixel Format: 'YUYV'
   Name        : YUV 4:2:2 (YUYV)
...
      Size: Discrete 1280x960
         Interval: Discrete 0.133 s (7.500 fps)
         Interval: Discrete 0.200 s (5.000 fps)

   Index       : 1
   Type        : Video Capture
   Pixel Format: 'MJPG' (compressed)
   Name        : MJPEG
...
      Size: Discrete 1280x960
         Interval: Discrete 0.033 s (30.000 fps)
         Interval: Discrete 0.040 s (25.000 fps)
         Interval: Discrete 0.050 s (20.000 fps)
         Interval: Discrete 0.067 s (15.000 fps)
         Interval: Discrete 0.100 s (10.000 fps)
         Interval: Discrete 0.200 s (5.000 fps)

Image files can be created by encoding the data as png and writing it to a file per frame:

> gst-launch -v -e v4l2src ! pngenc snapshot=false ! multifilesink location="frame_%05d.png"

Camera properties can be set during data acquisition using v4l2-ctl. The configuration parameters of the Logitech C270 are as follows:

> v4l2-ctl --list-ctrls
                     brightness (int)    : min=0 max=255 step=1 default=128 value=225
                       contrast (int)    : min=0 max=255 step=1 default=32 value=128
                     saturation (int)    : min=0 max=255 step=1 default=32 value=32
 white_balance_temperature_auto (bool)   : default=1 value=0
                           gain (int)    : min=0 max=255 step=1 default=64 value=131
           power_line_frequency (menu)   : min=0 max=2 default=2 value=0
      white_balance_temperature (int)    : min=0 max=10000 step=10 default=4000 value=1070
                      sharpness (int)    : min=0 max=255 step=1 default=24 value=24
         backlight_compensation (int)    : min=0 max=1 step=1 default=0 value=0
                  exposure_auto (menu)   : min=0 max=3 default=3 value=1
              exposure_absolute (int)    : min=1 max=10000 step=1 default=166 value=10000
         exposure_auto_priority (bool)   : default=0 value=0

Check individual values with -C <property> e.g.:

> v4l2-ctl -C brightness
brightness: 225

Set values with -c <property> = value e.g.:

> v4l2-ctl -c exposure_absolute=10000

Properties

exposure_absolute is the shutter length in counts of $100\,\mathrm{\mu s}$. At 10,000 this corresponds to one frame per second.
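That is, $10000 \times 100\,\mathrm{\mu s} = 1\,\mathrm{s}$ of exposure per frame.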

exposure_auto must be disabled to prevent the camera changing exposure in response to signal levels. Value 3 is enabled, value 1 is disabled. Values 0 and 2 are not valid.

white_balance_temperature_auto should be disabled - set to 0.

power_line_frequency should be set to 0 to disable any compensation.

brightness can be set quite high while maintaining black levels. Around 240 seems to show hot pixels on the sensor.

gain may also play a factor - to be explored.

Example image

Here is an image from the webcam with the sensor covered and the gain and a brightness set to very high values. Hot pixels can be seen directly.
$name

Read Webcam and Analyze the image

  Log in the "pcbl4sleo4g" PC as a daquser.(examp:ssh -Y daquser@pcbl4sleo4g / password : BeamLine15)
Line: 99 to 219
  run.sh

 
Added:
>
>
###############################################################################################################
#! /bin/bash
# Switch off automation
v4l2-ctl -c exposure_auto_priority=0
Line: 121 to 242
v4l2-ctl -c saturation=32
v4l2-ctl -c white_balance_temperature=6500
./Trigger
Added:
>
>
###############################################################################################################
 

After starting the bash script (./run.sh), two windows will appear on the screen.

Line: 128 to 250
 Title of window 1 (W1): Trigger (on ) Title of window 2 (W2) : Webcam (on )
Changed:
<
<

$name
>
>

$name
  The purpose of W1 is to control the settings of the WebCam. You can, by moving the sliders, change the settings of the camera. The parameters are defined as follows:
Line: 146 to 268
  Example of background image:
Changed:
<
<

$name
>
>

$name
 

Background Measurement

Line: 160 to 282
  To compile it:
Changed:
<
<
./compile-root.sh BG_Measurement.cpp
>
>
./compile-root.sh BG_Measurement
  This will create the file BG_Measurement.exe.
Line: 170 to 292
export LD_LIBRARY_PATH=/usr/local/lib:$LD_LIBRARY_PATH
source root/bin/thisroot.sh
Changed:
<
<
. The program BG_Measurement.cpp is command line driven. You have to specify the number of files which will be analyzed (e.g 150 ) and threshold need to be set (e.g 128).
>
>
The program BG_Measurement.cpp is command line driven. You have to specify the number of files which will be analyzed (e.g 150 ) and threshold need to be set (e.g 128).
  Example
Line: 194 to 317
 
  1. Pixels that flicker (i.e. change color from image to image)
In the "BW_SCALE histogram you can see, via the color grading, which pixels are dead (they should be red) and which are flickering (they are for example green or blue). This information will help you later to decide if a pixel has been hit by a particle.
Changed:
<
<

$name
>
>

$name
  The second histogram ("hvalue") shows on the x-axis the saturation of the pixels from 0 (black) to 255 (white). The value on the Y-axis is the number of pixels with that saturation.
Changed:
<
<

$name
>
>

$name
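For orientation, here is a hedged sketch of how per-pixel and saturation histograms in the spirit of "BW_SCALE" and "hvalue" could be filled from a single .tif frame; it is not the actual analysis source (it reads the frame with OpenCV and fills/saves the histograms with ROOT, so it would be built like the other ROOT-based programs), and the output file name is hypothetical:

###############################################################################################################
// Hedged sketch (not the actual analysis code): fill a per-pixel map and a
// saturation histogram from one frame.
#include <opencv2/opencv.hpp>
#include "TFile.h"
#include "TH1I.h"
#include "TH2I.h"
#include <cstdlib>

int main(int argc, char** argv)
{
    if (argc < 3) return 1;            // usage: ./histos frame.tif threshold
    cv::Mat frame = cv::imread(argv[1], cv::IMREAD_GRAYSCALE);
    if (frame.empty()) return 1;
    int threshold = std::atoi(argv[2]);

    TH2I bwScale("BW_SCALE", "pixels above threshold;x;y",
                 frame.cols, 0, frame.cols, frame.rows, 0, frame.rows);
    TH1I hvalue("hvalue", "pixel saturation;saturation;pixels", 256, 0, 256);

    for (int y = 0; y < frame.rows; ++y) {
        for (int x = 0; x < frame.cols; ++x) {
            int v = frame.at<uchar>(y, x);
            hvalue.Fill(v);
            if (v > threshold) bwScale.Fill(x, y);   // mark pixels above threshold
        }
    }

    TFile out("frame_histos.root", "RECREATE");      // hypothetical output file name
    bwScale.Write();
    hvalue.Write();
    out.Close();
    return 0;
}
###############################################################################################################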
  Once you have identified the dead and flickering pixels of your CCD you should carefully archive the result. This file (*.tif format), let's call it the calibration frame (CF), will be required by the next step.

Imagine you take one frame with the WebCam exposed to beam. Let's call this the beam_frame (BF). In order to tell which pixels in the BF have been hit by the beam you have to filter out the malfunctioning pixels. Mathematically this means: Pixels hit by beam = BF - CF.

Changed:
<
<
Such a subtraction of two images can be done with the program Substract_images.cpp.
>
>
Such a subtraction of two images can be done with the program Subtract_images.cpp.
  Before you start Substract_images you have to execute:
  
Line: 212 to 335
  source root/bin/thisroot.sh
Deleted:
<
<
Work is ongoing!!!!
 
Changed:
<
<
*Second method can be used with "opencv" *
>
>

Second method for Analysis with "opencv" :

  Background Measurement

The program "BackgroundCreator.cpp" produce an output background image with "*.tif " extension by taking average of the all frames which are taken from webcam as background.

Deleted:
<
<
Let's follow the steps as follows:

-Build the binaries:

  cd /afs/cern.ch/user/d/daquser/public/Webcam/aug28/Analysis/BackgroundCreator
Changed:
<
<
-compile it (if you have made changes to the source code):
>
>
To compile it (if you have made changes to the source code):
  cmake .

Line: 236 to 355
  run_background.sh
Added:
>
>
###############################################################################################################
 #!/bin/bash

BACKGROUNDS="../webcam_bg_output/"

Line: 246 to 366
 done

./BackgroundCreator $COMBINED out.tif

Added:
>
>
###############################################################################################################
 
Changed:
<
<
After starting the bash script ./run.sh "out.tif" file will be created.
>
>
After starting the bash script ./run_background.sh "out.tif" file will be created.
 Once we have this file we can go to the next step.
Changed:
<
<
In order to have a clear signal we must substract the "out.tif" file (CF:Calibration Frame) from the Beam Frame with "Substract.cpp" program.
>
>
Subtraction and Analysis

"*Substract.cpp*" program has been developed to have a clear signal by subtracting calibration frame from each beam frames as it mentioned in previous section. it also count the number of pixels for each frames. We can analyze the subtracting images with "*Analysis.cpp*" program. This program gives us information about the number of pixels with the saturation value from 0 (black) to 255 (white) within .txt file.

  cd /afs/cern.ch/user/d/daquser/public/Webcam/aug28/Analysis
Changed:
<
<
Bash script will produce 2 different files "*.tif " and "*.txt " . The number of "*tif* files which is produced must be the same number as Beam Frames
>
>
Both programs can be compiled in the same way if you have made any changes to the code:

cmake .

make

To execute these programs, the bash script "run.sh" was created.

After starting the script (./run.sh), two different kinds of files will be produced: "*.tif" files from Subtract.cpp and "*.txt" files from Analysis.cpp.

 
Added:
>
>
###############################################################################################################
 #!/bin/bash

CURRENT_DIR=`pwd`

Changed:
<
<
OUTPUT="$CURRENT_DIR/output" CAPTURED="$CURRENT_DIR/webcam_data"
>
>
OUTPUT="$CURRENT_DIR/output" ////create an "output" folder to write the produced files. CAPTURED="$CURRENT_DIR/webcam_data" ////read beam frames from "webcam_data" folder
  SUBTRACT_THRESHOLD=128
Changed:
<
<
SUBTRACT_BACKGROUND="$CURRENT_DIR/BackgroundCreator/out.tif"
>
>
SUBTRACT_BACKGROUND="$CURRENT_DIR/BackgroundCreator/out.tif"   # read the calibration frame
 
Added:
>
>
# loop: subtract the calibration frame ("out.tif") from each beam frame
cd $CAPTURED
for file in *.tif; do
    echo "processing $file..."
    $CURRENT_DIR/Subtract $file $SUBTRACT_BACKGROUND $SUBTRACT_THRESHOLD "$OUTPUT/$file.subtracted.tif"
Changed:
<
<
$CURRENT_DIR/Analysis "$OUTPUT/$file.subtracted.tif" > "$OUTPUT/$file.txt"
>
>
    $CURRENT_DIR/Analysis "$OUTPUT/$file.subtracted.tif" > "$OUTPUT/$file.txt"   # create a .txt file for each subtracted image
 done
Changed:
<
<
>
>
###############################################################################################################
 
Added:
>
>

Test with Radioactive Source

 
Added:
>
>
Strontium radiactive source was used: In order to estimate background several images were taken from webcam without any source and afterwards the experiment was repeated with using radioactive source.
 
Added:
>
>
Example background image:
 
Added:
>
>

$name
 
Added:
>
>
1- The first Timepix was removed. Radioactive source was put in front of the Webcam.
 
Added:
>
>
Example image with using radioactive source :
 
Changed:
<
<

Preview webcam with gstreamer:

> gst-launch-0.10 -v -e v4l2src ! timeoverlay ! autovideosink
Debugging can be enabled with an environment variable:
> GST_DEBUG=3 gst-launch-0.10 -v -e v4l2src ! timeoverlay ! autovideosink
Elements are connected with `!` characters, v4l2src supplies webcam data, timeoverlay takes image data and timestamps it, autovideosink creates a X11 window and renders the image data into it. Elements can be interrogated for their configuration, e.g.:
> gst-inspect v4l2src
...
Element Properties:
  name                : The name of the object
                        flags: readable, writable
                        String. Default: null Current: "v4l2src0"
  blocksize           : Size in bytes to read per buffer (-1 = default)
                        flags: readable, writable
                        Unsigned Long. Range: 0 - 18446744073709551615 Default: 4096 Current: 4096
  num-buffers         : Number of buffers to output before sending EOS (-1 = unlimited)
                        flags: readable, writable
                        Integer. Range: -1 - 2147483647 Default: -1 Current: -1
  typefind            : Run typefind before negotiating
                        flags: readable, writable
                        Boolean. Default: false Current: false
  do-timestamp        : Apply current stream time to buffers
                        flags: readable, writable
                        Boolean. Default: false Current: false
  device              : Device location
                        flags: readable, writable
                        String. Default: "/dev/video0" Current: "/dev/video0"
  device-name         : Name of the device
                        flags: readable
                        String. Default: null Current: "UVC Camera (046d:0825)"
  device-fd           : File descriptor of the device
                        flags: readable
                        Integer. Range: -1 - 2147483647 Default: -1 Current: -1
  flags               : Device type flags
                        flags: readable
                        Flags "GstV4l2DeviceTypeFlags" Default: 0x00000000, "(none)" Current: 0x00000000, "(none)"
                           (0x00000001): capture          - Device supports video capture
                           (0x00000002): output           - Device supports video playback
                           (0x00000004): overlay          - Device supports video overlay
                           (0x00000010): vbi-capture      - Device supports the VBI capture
                           (0x00000020): vbi-output       - Device supports the VBI output
                           (0x00010000): tuner            - Device has a tuner or modulator
                           (0x00020000): audio            - Device has audio inputs or outputs
  queue-size          : Number of buffers to be enqueud in the driver in streaming mode
                        flags: readable, writable
                        Unsigned Integer. Range: 1 - 16 Default: 2 Current: 2
  always-copy         : If the buffer will or not be used directly from mmap
                        flags: readable, writable
                        Boolean. Default: true Current: true

The camera format can be set like so:

> v4l2-ctl --set-fmt-video=width=1280,height=960,pixelformat=0

The pixel formats are listed with:

> v4l2-ctl --list-formats-ext
ioctl: VIDIOC_ENUM_FMT
   Index       : 0
   Type        : Video Capture
   Pixel Format: 'YUYV'
   Name        : YUV 4:2:2 (YUYV)
...
      Size: Discrete 1280x960
         Interval: Discrete 0.133 s (7.500 fps)
         Interval: Discrete 0.200 s (5.000 fps)

   Index       : 1
   Type        : Video Capture
   Pixel Format: 'MJPG' (compressed)
   Name        : MJPEG
...
      Size: Discrete 1280x960
         Interval: Discrete 0.033 s (30.000 fps)
         Interval: Discrete 0.040 s (25.000 fps)
         Interval: Discrete 0.050 s (20.000 fps)
         Interval: Discrete 0.067 s (15.000 fps)
         Interval: Discrete 0.100 s (10.000 fps)
         Interval: Discrete 0.200 s (5.000 fps)

Image files can be created by encoding the data as png and writing it to a file per frame:

> gst-launch -v -e v4l2src ! pngenc snapshot=false ! multifilesink location="frame_%05d.png"

Camera properties can be set during data acquisition using v4l2-ctl. The configuration parameters of the Logitech C270 are as follows:

> v4l2-ctl --list-ctrls
                     brightness (int)    : min=0 max=255 step=1 default=128 value=225
                       contrast (int)    : min=0 max=255 step=1 default=32 value=128
                     saturation (int)    : min=0 max=255 step=1 default=32 value=32
 white_balance_temperature_auto (bool)   : default=1 value=0
                           gain (int)    : min=0 max=255 step=1 default=64 value=131
           power_line_frequency (menu)   : min=0 max=2 default=2 value=0
      white_balance_temperature (int)    : min=0 max=10000 step=10 default=4000 value=1070
                      sharpness (int)    : min=0 max=255 step=1 default=24 value=24
         backlight_compensation (int)    : min=0 max=1 step=1 default=0 value=0
                  exposure_auto (menu)   : min=0 max=3 default=3 value=1
              exposure_absolute (int)    : min=1 max=10000 step=1 default=166 value=10000
         exposure_auto_priority (bool)   : default=0 value=0

Check individual values with -C <property> e.g.:

> v4l2-ctl -C brightness
brightness: 225

Set values with -c <property> = value e.g.:

> v4l2-ctl -c exposure_absolute=10000

Properties

exposure_absolute is the shutter length in counts of $100\,\mathrm{\mu s}$. At 10,000 this corresponds to one frame per second.

exposure_auto must be disabled to prevent the camera changing exposure in response to signal levels. Value 3 is enabled, value 1 is disabled. Values 0 and 2 are not valid.

white_balance_temperature_auto should be disabled - set to 0.

power_line_frequency should be set to 0 to disable any compensation.

brightness can be set quite high while maintaining black levels. Around 240 seems to show hot pixels on the sensor.

gain may also play a factor - to be explored.

Example image

Here is an image from the webcam with the sensor covered and the gain and a brightness set to very high values. Hot pixels can be seen directly.
$name

Read Webcam and Analyse the image

>
>

$name
 

Hardware

Dimensions

Line: 454 to 464
 
META FILEATTACHMENT attachment="hvalue.png" attr="" comment="" date="1441016203" name="hvalue.png" path="hvalue.png" size="10954" user="cdozen" version="1"
META FILEATTACHMENT attachment="BW_SCALE.png" attr="" comment="" date="1441016313" name="BW_SCALE.png" path="BW_SCALE.png" size="14569" user="cdozen" version="1"
META FILEATTACHMENT attachment="frame_0.png" attr="" comment="" date="1441057380" name="frame_0.png" path="frame_0.png" size="54584" user="cdozen" version="1"
Added:
>
>
META FILEATTACHMENT attachment="str_webcam.jpg" attr="" comment="" date="1441143264" name="str_webcam.jpg" path="str_webcam.jpg" size="63537" user="cdozen" version="1"
META FILEATTACHMENT attachment="background.png" attr="" comment="" date="1441144429" name="background.png" path="background.png" size="236409" user="cdozen" version="1"

Revision 28 (2015-09-01) - CandanDozen

Line: 1 to 1
 
META TOPICPARENT name="MajorTasks"
Project Proposal
Proposal video
Line: 136 to 136
 
Threshold (Pixel Value) A pixel can have a saturation from 0 (black) to 255 (white). With this threshold you set the limit above which a pixel will be saved in the output file. With e.g. threshold set to 150 a pixel that has a saturation of e.g. 100 will not be recorded. The purpose of this parameter is to allow you to reduce the amount of noise in the picture.
Num of active Pixel This is not a control parameter but the number of pixels above threshold in the current frame.
Threshold (num of pixel) A picture will only be recorded if the number of pixels above threshold (see first parameter) is larger than this number. If this parameter is set to 50 and a frame has only e.g. 39 pixels above threshold no file will be recorded. This parameter also helps you to suppress background
Changed:
<
<
Exposure This is the exposure time of a frame in ms. To be debugged
>
>
Exposure This is the exposure time of a frame in s.
 

The area below the sliders shows you the last frame that was captured by the camera. It shows all frames, not only those that are recorded.

Changed:
<
<
The second Window, W2 also shows the current frame but ........
>
>
The second Window, W2 also shows the current frame with RGB channels.
  Leave it running for a while . Images (1 fps (or whatever exposure time you have set)) will be saved with ".tif " extensions in the "output" directory.
Line: 214 to 214
  Work is ongoing!!!!

Added:
>
>
*Second method can be used with "opencv" *

Background Measurement

The program "BackgroundCreator.cpp" produce an output background image with "*.tif " extension by taking average of the all frames which are taken from webcam as background. Let's follow the steps as follows:

-Build the binaries:

cd /afs/cern.ch/user/d/daquser/public/Webcam/aug28/Analysis/BackgroundCreator

-compile it (if you have made changes to the source code):

cmake .

make

To read each background image from the output directory, the "run_background.sh" script was created.

run_background.sh

#!/bin/bash

BACKGROUNDS="../webcam_bg_output/"
COMBINED=""

for file in "$BACKGROUNDS/*.tiff"; do
  COMBINED="$COMBINED $file"
done

./BackgroundCreator $COMBINED out.tif

After starting the bash script ./run.sh "out.tif" file will be created. Once we have this file we can go to the next step.

In order to have a clear signal we must substract the "out.tif" file (CF:Calibration Frame) from the Beam Frame with "Substract.cpp" program.

cd /afs/cern.ch/user/d/daquser/public/Webcam/aug28/Analysis

Bash script will produce 2 different files "*.tif " and "*.txt " . The number of "*tif* files which is produced must be the same number as Beam Frames

#!/bin/bash

CURRENT_DIR=`pwd`
OUTPUT="$CURRENT_DIR/output"
CAPTURED="$CURRENT_DIR/webcam_data"

SUBTRACT_THRESHOLD=128
SUBTRACT_BACKGROUND="$CURRENT_DIR/BackgroundCreator/out.tif"


cd $CAPTURED
for file in *.tif; do
    echo "processing $file..."
    $CURRENT_DIR/Subtract $file $SUBTRACT_BACKGROUND $SUBTRACT_THRESHOLD "$OUTPUT/$file.subtracted.tif"
    $CURRENT_DIR/Analysis "$OUTPUT/$file.subtracted.tif" > "$OUTPUT/$file.txt"
done

 

Revision 27 (2015-08-31) - CandanDozen

Line: 1 to 1
 
META TOPICPARENT name="MajorTasks"
Project Proposal
Proposal video
Line: 144 to 144
  Leave it running for a while . Images (1 fps (or whatever exposure time you have set)) will be saved with ".tif " extensions in the "output" directory.

Changed:
<
<
To analyse the images
>
>
Example of background image:
 
Changed:
<
<
The program "Analysis2.cpp" has been developed to analyse the images taken by the webcam. It uses the Root analysis framework for the generation of histograms.
>
>

$name
 
Deleted:
<
<
Let's firs build the binaries:
 
Changed:
<
<
cd /afs/cern.ch/user/d/daquser/public/Webcam/
>
>
Background Measurement

The program "BG_Measurement.cpp" has been developed to analyse the images taken by the webcam. It uses the Root analysis framework for the generation of histograms.

Let's first build the binaries:

cd /afs/cern.ch/user/d/daquser/public/Webcam/Analysis/

  To compile it:
Changed:
<
<
./compile-root.sh Analysis2
>
>
./compile-root.sh BG_Measurement.cpp
 
Changed:
<
<
This will create the file Analysis2.exe.
>
>
This will create the file BG_Measurement.exe.
  Before using the application and Root you must set up your environment. Execute these commands:
Line: 164 to 170
  export LD_LIBRARY_PATH=/usr/local/lib:$LD_LIBRARY_PATH source root/bin/thisroot.sh
Changed:
<
<
The program Analysis2.exe is command line driven. You have to specify the name(s) of at least on *.tif frame.
>
>
. The program BG_Measurement.cpp is command line driven. You have to specify the number of files which will be analyzed (e.g 150 ) and threshold need to be set (e.g 128).
  Example
Changed:
<
<
./Analysis2.exe frame_20.tif frame_21.tif
>
>
   ./BG_Measurement.exe  -n 150 -t 128

The program will read the number of files and the threshold specified on the command line. It will collect and sum all background images from the "output" directory (/afs/cern.ch/user/d/daquser/public/Webcam/Trigger/output) in order to produce a single background image.

Finally it will generate two files, named background.root and BG_Total.tif. Please save these files immediately in your data area because they will be overwritten the next time you execute BG_Measurement. You can display the image (under Linux) with the command "display BG_Total.tif".

If you would like to browse the output root file, type the following commands in a terminal:

"root -l background.root "  --->

this loads the root file
 
Changed:
<
<
The program will read the images specified in the command line. .....
>
>
   "new TBrowser "  --->
it opens a browser to see the histograms.
 
Changed:
<
<
Finally it will generate a file with the name result.png. Please save this file immediately in your data area because it will be overwritten when you execute Analysis2 the next time. You can display the file (under Linux) with the command "display result.png" On the displayed image you will find two histograms. The first one has the title "BW_SCALE". The X axis represents the horizontal pixels of the WebCam and the Y-axis the vertical pixels. Therefore each dot in the histogram represents one pixel. The color of the pixel in the histogram tells you how often the respective pixel was above threshold in the input images.
>
>
Also, on the displayed image you will find two histograms. The first one has the title "BW_SCALE". The X axis represents the horizontal pixels of the WebCam and the Y-axis the vertical pixels. Therefore each dot in the histogram represents one pixel. The color of the pixel in the histogram tells you how often the respective pixel was above threshold in the input images.
 The purpose is to detect pixels that are not working properly If you put the WebCam into a dark box and record images, all images should ideally be black. A CCD may have two types of dead pixels:
  1. Pixels that are always white (or at least not black)
Line: 372 to 385
 
META FILEATTACHMENT attachment="Webcam.png" attr="" comment="" date="1439466175" name="Webcam.png" path="Webcam.png" size="127120" user="cdozen" version="1"
META FILEATTACHMENT attachment="hvalue.png" attr="" comment="" date="1441016203" name="hvalue.png" path="hvalue.png" size="10954" user="cdozen" version="1"
META FILEATTACHMENT attachment="BW_SCALE.png" attr="" comment="" date="1441016313" name="BW_SCALE.png" path="BW_SCALE.png" size="14569" user="cdozen" version="1"
Added:
>
>
META FILEATTACHMENT attachment="frame_0.png" attr="" comment="" date="1441057380" name="frame_0.png" path="frame_0.png" size="54584" user="cdozen" version="1"

Revision 26 (2015-08-31) - CandanDozen

Line: 1 to 1
 
META TOPICPARENT name="MajorTasks"
Project Proposal
Proposal video
Line: 181 to 181
 
  1. Pixels that flicker (i.e. change color from image to image)
In the "BW_SCALE histogram you can see, via the color grading, which pixels are dead (they should be red) and which are flickering (they are for example green or blue). This information will help you later to decide if a pixel has been hit by a particle.
Changed:
<
<
The second histogram ("hvalue") shows on the x-axis the saturation of the pixels from 0 (black) to 255 (white). The value on the Y-axis is the number of pixels with that saturation.
>
>

$name
 
Added:
>
>
The second histogram ("hvalue") shows on the x-axis the saturation of the pixels from 0 (black) to 255 (white). The value on the Y-axis is the number of pixels with that saturation.
$name
  Once you have identified the dead and flickering pixels of your CCD you should carefully archive the result. This file (*.tif format), lets call it the calibration frame (CF), will be required by the next step.
Line: 364 to 366
  -- TimBrooks - 2015-06-03
Added:
>
>
 
META FILEATTACHMENT attachment="webcam_example.png" attr="" comment="Webcam high gain image" date="1434724199" name="webcam_example.png" path="webcam_example.png" size="19038" user="brooks" version="1"
META FILEATTACHMENT attachment="Webcam.png" attr="" comment="" date="1439466175" name="Webcam.png" path="Webcam.png" size="127120" user="cdozen" version="1"
Added:
>
>
META FILEATTACHMENT attachment="hvalue.png" attr="" comment="" date="1441016203" name="hvalue.png" path="hvalue.png" size="10954" user="cdozen" version="1"
META FILEATTACHMENT attachment="BW_SCALE.png" attr="" comment="" date="1441016313" name="BW_SCALE.png" path="BW_SCALE.png" size="14569" user="cdozen" version="1"

Revision 25 (2015-08-13) - CandanDozen

Line: 1 to 1
 
META TOPICPARENT name="MajorTasks"
Project Proposal
Proposal video
Line: 128 to 128
 Title of window 1 (W1): Trigger (on ) Title of window 2 (W2) : Webcam (on )
Added:
>
>

$name
 The purpose of W1 is to control the settings of the WebCam. You can, by moving the sliders, change the settings of the camera. The parameters are defined as follows:

Parameter Description
Line: 363 to 365
 -- TimBrooks - 2015-06-03

META FILEATTACHMENT attachment="webcam_example.png" attr="" comment="Webcam high gain image" date="1434724199" name="webcam_example.png" path="webcam_example.png" size="19038" user="brooks" version="1"
Added:
>
>
META FILEATTACHMENT attachment="Webcam.png" attr="" comment="" date="1439466175" name="Webcam.png" path="Webcam.png" size="127120" user="cdozen" version="1"

Revision 24 (2015-08-11) - CandanDozen

Line: 1 to 1
 
META TOPICPARENT name="MajorTasks"
Project Proposal
Proposal video
Line: 158 to 158
  Before using the application and Root you must set up your environment. Execute these commands:
Changed:
<
<
>
>
  export LD_LIBRARY_PATH=/usr/local/lib:$LD_LIBRARY_PATH source root/bin/thisroot.sh
Changed:
<
<
>
>
 The program Analysis2.exe is command line driven. You have to specify the name(s) of at least on *.tif frame.
Added:
>
>
 Example
Changed:
<
<
./Analysis2.exe frame_20.tif frame_21.tif
>
>
./Analysis2.exe frame_20.tif frame_21.tif
  The program will read the images specified in the command line. .....
Line: 188 to 190
 Such a subtraction of two images can be done with the program Substract_images.cpp.

Before you start Substract_images you have to execute:

Added:
>
>
  
  export LD_LIBRARY_PATH=/usr/local/lib:$LD_LIBRARY_PATH source root/bin/thisroot.sh
Added:
>
>
  Work is ongoing!!!!

Revision 23 (2015-08-11) - MarkusJoos

Line: 1 to 1
 
META TOPICPARENT name="MajorTasks"
Project Proposal
Proposal video
Line: 134 to 134
 
Threshold (Pixel Value) A pixel can have a saturation from 0 (black) to 255 (white). With this threshold you set the limit above which a pixel will be saved in the output file. With e.g. threshold set to 150 a pixel that has a saturation of e.g. 100 will not be recorded. The purpose of this parameter is to allow you to reduce the amount of noise in the picture.
Num of active Pixel This is not a control parameter but the number of pixels above threshold in the current frame.
Threshold (num of pixel) A picture will only be recorded if the number of pixels above threshold (see first parameter) is larger than this number. If this parameter is set to 50 and a frame has only e.g. 39 pixels above threshold no file will be recorded. This parameter also helps you to suppress background
Changed:
<
<
Exposure This is the exposure time of a frame in ms. To be dubugged
>
>
Exposure This is the exposure time of a frame in ms. To be debugged
 
Changed:
<
<
One of them is with Trackbar to controll the threshold by hand and it counts the the active pixels. Leave it running for a while . Images (1 fps) will be saved with ".tif " extensions in the "output" directory.
>
>
The area below the sliders shows you the last frame that was captured by the camera. It shows all frames, not only those that are recorded. The second Window, W2 also shows the current frame but ........
 
Changed:
<
<
To analyse the images
>
>
Leave it running for a while . Images (1 fps (or whatever exposure time you have set)) will be saved with ".tif " extensions in the "output" directory.
 
Changed:
<
<
"Analysis2.cpp" created to analyse the images taken from webcam. It works with root to have the histograms. Environmental settings must be done for root.)
>
>
To analyse the images
 
Changed:
<
<
/afs/cern.ch/user/d/daquser/public/Webcam/
>
>
The program "Analysis2.cpp" has been developed to analyse the images taken by the webcam. It uses the Root analysis framework for the generation of histograms.
 
Changed:
<
<
To compile it:
>
>
Let's firs build the binaries:
 
Changed:
<
<
compile-root.sh Analysis2
>
>
cd /afs/cern.ch/user/d/daquser/public/Webcam/
 
Changed:
<
<
After compiling Analysis2.exe file will be created.
>
>
To compile it:
 
Changed:
<
<
Add several frames (for example 10 frames) to Analysis2.cpp code.
>
>
./compile-root.sh Analysis2
 
Changed:
<
<
Run the Analysis2.cpp with bash script:
>
>
This will create the file Analysis2.exe.
 
Changed:
<
<
./run_Analysis_image.sh
>
>
Before using the application and Root you must set up your environment. Execute these commands:
 
Deleted:
<
<
run_Analysis_image.sh
  
#!/bin/bash
 export LD_LIBRARY_PATH=/usr/local/lib:$LD_LIBRARY_PATH source root/bin/thisroot.sh
Deleted:
<
<
./Analysis2.exe
 
Added:
>
>
The program Analysis2.exe is command line driven. You have to specify the name(s) of at least on *.tif frame. Example ./Analysis2.exe frame_20.tif frame_21.tif
 
Changed:
<
<
To substract two images to reduce the white hits , Substract_images.cpp code created.
>
>
The program will read the images specified in the command line. .....
 
Changed:
<
<
It can run as same as Analysis2.cpp code.
>
>
Finally it will generate a file with the name result.png. Please save this file immediately in your data area because it will be overwritten when you execute Analysis2 the next time. You can display the file (under Linux) with the command "display result.png" On the displayed image you will find two histograms. The first one has the title "BW_SCALE". The X axis represents the horizontal pixels of the WebCam and the Y-axis the vertical pixels. Therefore each dot in the histogram represents one pixel. The color of the pixel in the histogram tells you how often the respective pixel was above threshold in the input images. The purpose is to detect pixels that are not working properly If you put the WebCam into a dark box and record images, all images should ideally be black. A CCD may have two types of dead pixels:
  1. Pixels that are always white (or at least not black)
  2. Pixels that flicker (i.e. change color from image to image)
In the "BW_SCALE histogram you can see, via the color grading, which pixels are dead (they should be red) and which are flickering (they are for example green or blue). This information will help you later to decide if a pixel has been hit by a particle.
 
Added:
>
>
The second histogram ("hvalue") shows on the x-axis the saturation of the pixels from 0 (black) to 255 (white). The value on the Y-axis is the number of pixels with that saturation.

Once you have identified the dead and flickering pixels of your CCD you should carefully archive the result. This file (*.tif format), lets call it the calibration frame (CF), will be required by the next step.

Imagine you take one frame with the WebCam exposed to beam. Let's call this the beam_frame (BF). In order to tell which pixels in the BF have been hit by the beam you have to filter out the malfunctioning pixels. Mathematically this means: Pixels hit by beam = BF - CF.

Such a subtraction of two images can be done with the program Substract_images.cpp.

Before you start Substract_images you have to execute:
export LD_LIBRARY_PATH=/usr/local/lib:$LD_LIBRARY_PATH
source root/bin/thisroot.sh

  Work is ongoing!!!!

Revision 22 (2015-08-11) - MarkusJoos

Line: 1 to 1
 
META TOPICPARENT name="MajorTasks"
Project Proposal
Proposal video
Line: 88 to 88
  /afs/cern.ch/user/d/daquser/public/Webcam/Trigger

Changed:
<
<
To compile it:
>
>
To compile it (if you have made changes to the source code):
  cmake .
Line: 123 to 123
 ./Trigger
Changed:
<
<
After run the bash script ("=./run.sh"= ) two windows will be appear on the screen. One of them is with Trackbar to controll the threshold by hand and it counts the the active pixels. Leave it running for a while . Images (1 fps) will be saved with ".tif " extensions in the "output" directory.
>
>
After starting the bash script ("=./run.sh"= ) two windows will be appear on the screen.

Title of window 1 (W1): Trigger (on ) Title of window 2 (W2) : Webcam (on )

The purpose of W1 is to control the settings of the WebCam. You can, by moving the sliders, change the settings of the camera. The parameters are defined as follows:

Parameter Description
Threshold (Pixel Value) A pixel can have a saturation from 0 (black) to 255 (white). With this threshold you set the limit above which a pixel will be saved in the output file. With e.g. threshold set to 150 a pixel that has a saturation of e.g. 100 will not be recorded. The purpose of this parameter is to allow you to reduce the amount of noise in the picture.
Num of active Pixel This is not a control parameter but the number of pixels above threshold in the current frame.
Threshold (num of pixel) A picture will only be recorded if the number of pixels above threshold (see first parameter) is larger than this number. If this parameter is set to 50 and a frame has only e.g. 39 pixels above threshold no file will be recorded. This parameter also helps you to suppress background
Exposure This is the exposure time of a frame in ms. To be dubugged

One of them is with Trackbar to controll the threshold by hand and it counts the the active pixels. Leave it running for a while . Images (1 fps) will be saved with ".tif " extensions in the "output" directory.

  To analyse the images

Revision 21 (2015-08-11) - CandanDozen

Line: 1 to 1
 
META TOPICPARENT name="MajorTasks"
Project Proposal
Proposal video
Line: 82 to 82
  Log in the "pcbl4sleo4g" PC as a daquser.(examp:ssh -Y daquser@pcbl4sleo4g / password : BeamLine15)
Changed:
<
<
Go to "Webcam4L_bl4s" directory.
>
>
Go to "Webcam" directory.
  In order to control the webcam go to the "Trigger" directory
Changed:
<
<
/afs/cern.ch/user/d/daquser/Webcam4L_bl4s/Trigger
>
>
/afs/cern.ch/user/d/daquser/public/Webcam/Trigger
  To compile it:
Line: 123 to 123
 ./Trigger
Changed:
<
<
After run the bash script "=./run.sh"= 2 windows will be appear on the screen. One of them is with Trackbar to controll the threshold by hand and it counts the the active pixels. Leave it running for a while . Images (1 fps) will be saved with ".tif " extensions in the "output" directory.
>
>
After run the bash script ("=./run.sh"= ) two windows will be appear on the screen. One of them is with Trackbar to controll the threshold by hand and it counts the the active pixels. Leave it running for a while . Images (1 fps) will be saved with ".tif " extensions in the "output" directory.
  To analyse the images
Changed:
<
<
Go to the "Analysis" directory. It works with root. Environmental settings must be done for root.)
>
>
"Analysis2.cpp" created to analyse the images taken from webcam. It works with root to have the histograms. Environmental settings must be done for root.)
 
Changed:
<
<
/afs/cern.ch/user/d/daquser/Webcam4L_bl4s/Analysis
>
>
/afs/cern.ch/user/d/daquser/public/Webcam/
  To compile it:
Line: 139 to 139
  Add several frames (for example 10 frames) to Analysis2.cpp code.

Changed:
<
<
Run the Analysis2.cpp with bash script which is created.
>
>
Run the Analysis2.cpp with bash script:
  ./run_Analysis_image.sh
Line: 152 to 152
 
Added:
>
>
To substract two images to reduce the white hits , Substract_images.cpp code created.

It can run as same as Analysis2.cpp code.

 Work is ongoing!!!!

Revision 20 (2015-08-10) - CandanDozen

Line: 1 to 1
 
META TOPICPARENT name="MajorTasks"
Project Proposal
Proposal video
Line: 77 to 77
 

Software

Added:
>
>
Read webcam and Analyse Images

Log in the "pcbl4sleo4g" PC as a daquser.(examp:ssh -Y daquser@pcbl4sleo4g / password : BeamLine15)

Go to "Webcam4L_bl4s" directory.

In order to control the webcam go to the "Trigger" directory

/afs/cern.ch/user/d/daquser/Webcam4L_bl4s/Trigger

To compile it:

cmake .

make

Camera properties are set in the "run.sh" script that was created:

run.sh

 
#! /bin/bash
# Switch off automation
v4l2-ctl -c exposure_auto_priority=0
v4l2-ctl -c exposure_auto=1
v4l2-ctl -c backlight_compensation=0
v4l2-ctl -c white_balance_temperature_auto=0
v4l2-ctl -c power_line_frequency=0

# Set shutter to 1 second
v4l2-ctl -c exposure_absolute=10000
# Amplify as much as possible
v4l2-ctl -c gain=255
v4l2-ctl -c contrast=255
v4l2-ctl -c brightness=242

v4l2-ctl --set-fmt-video=width=1280,height=960,pixelformat=0
#v4l2-ctl -c --set-param=1

# Non-critical
v4l2-ctl -c saturation=32
v4l2-ctl -c white_balance_temperature=6500
./Trigger

After running the bash script ("./run.sh") two windows will appear on the screen. One of them has a trackbar to control the threshold by hand, and it counts the active pixels. Leave it running for a while. Images (1 fps) will be saved with a ".tif" extension in the "output" directory.

To analyse the images

Go to the "Analysis" directory. It works with root. Environmental settings must be done for root.)

/afs/cern.ch/user/d/daquser/Webcam4L_bl4s/Analysis

To compile it:

compile-root.sh Analysis2

After compiling Analysis2.exe file will be created.

Add several frames (for example 10 frames) to Analysis2.cpp code.

Run Analysis2.cpp with the bash script that was created:

./run_Analysis_image.sh

run_Analysis_image.sh

  
#!/bin/bash
export LD_LIBRARY_PATH=/usr/local/lib:$LD_LIBRARY_PATH
source root/bin/thisroot.sh
./Analysis2.exe

Work is ongoing!!!!

 Preview webcam with gstreamer:
> gst-launch-0.10 -v -e v4l2src ! timeoverlay ! autovideosink
Debugging can be enabled with an environment variable:
Line: 199 to 283
Here is an image from the webcam with the sensor covered and the gain and brightness set to very high values. Hot pixels can be seen directly.
$name
Added:
>
>

Read Webcam and Analyse the image

 

Hardware

Dimensions

The active area of the sensor appears to be 4mm x 3mm. Given the maximum resolution of 1280x960 pixels, the pixels are 3.125 microns square.
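As a quick cross-check of the quoted pitch: $4\,\mathrm{mm}/1280 = 3\,\mathrm{mm}/960 = 3.125\,\mathrm{\mu m}$, so the pixels are indeed square.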
Line: 221 to 309
 
Look for Linux compatible S/W for reading single images via USB from the camera Candan to be started
Understanding the camera Candan Question: can we control the exposure time? If necessary Markus will contact Logitech about technical support
Added:
>
>

List Of Equipments

  1. 3 Logitech C270 Webcams.
  2. 2 Timepix with FITPIX.
  3. 5 SATA Hard Disks.
  4. 1 USB Hub (7-Port Powered Mobile Hub) and cables.

 -- TimBrooks - 2015-06-03

META FILEATTACHMENT attachment="webcam_example.png" attr="" comment="Webcam high gain image" date="1434724199" name="webcam_example.png" path="webcam_example.png" size="19038" user="brooks" version="1"

Revision 19 - 2015-08-10 - CandanDozen

Line: 1 to 1
 
META TOPICPARENT name="MajorTasks"
Project Proposal
Proposal video
Line: 14 to 14
 

Preparation in the lab

  1. Install one Webcam and Two Timepix in the sandwich structure
  2. Buy a USB hub (don't wait, do it now) :
Changed:
<
<
  1. Connect the Webcam and the Timepix to the USB hub
  2. Connect the hub to the Leo4G DAQ PC (this is a retired GEN-II ROS with CC7)
  3. Install PixelMan on the PC as well as the S/W that is required to read the Webcam
>
>
  1. Connect the Webcam and the Timepix to the USB hub :
  2. Connect the hub to the Leo4G DAQ PC (this is a retired GEN-II ROS with CC7):
  3. Install PixelMan on the PC as well as the S/W that is required to read the Webcam:
 
  1. Set up a script that allows acquiring images from TimePix and Webcam at the same time
Added:
>
>
  1. * Webcam modified by removing the protective glass filter on the sensor and installed in the sandwich structure with two Timepix.
  2. * 7-Port Powered Mobile Hub bought.
  3. * The connection chain is completed: Webcam & Timepix -> USB hub -> Leo4G DAQ PC.
  4. * PixelMan and OpenCV installed for Timepix and Webcam respectively.

 

Background measurement

  1. Tape the lens of the Webcam to keep ambient light outside
  2. Acquire a few images. Identify:

Revision 18 - 2015-07-20 - CandanDozen

Line: 1 to 1
 
META TOPICPARENT name="MajorTasks"
Project Proposal
Proposal video
Line: 13 to 13
 

Preparation in the lab

  1. Install one Webcam and Two Timepix in the sandwich structure
Changed:
<
<
  1. Buy a USB hub (don't wait, do it now)
>
>
  1. Buy a USB hub (don't wait, do it now) :
 
  1. Connect the Webcam and the Timepix to the USB hub
  2. Connect the hub to the Leo4G DAQ PC (this is a retired GEN-II ROS with CC7)
  3. Install PixelMan on the PC as well as the S/W that is required to read the Webcam
Line: 59 to 59
 

Development

Date Action
Added:
>
>
2015-07-20 OpenCV installed on pcbl4sleo4g PC.
2015-07-16 CentOS 7 installed on a new hard disk. PC name is: pcbl4sleo4g.
 
2015-06-23 Managed to disable MJPEG compression on camera
2015-06-16 Video for Linux (V4L2) installed on bl4sdaq/blrsdaq1 PC for controlling the webcam from Linux.
yum install v4l-utils
2015-06-15 Pixelman s/w installed on CentOS 7, connected TimePix device to PC.

Revision 17 - 2015-07-17 - MarkusJoos

Line: 1 to 1
 
META TOPICPARENT name="MajorTasks"
Project Proposal
Proposal video
Line: 6 to 6
 
Added:
>
>

Work plan for the execution of the experiment

Below is a list of steps that should be executed during the beam time in September. NOTE: This is just the vision of Markus. Input from other people is required and milestones have to be formulated.

Preparation in the lab

  1. Install one Webcam and Two Timepix in the sandwich structure
  2. Buy a USB hub (don't wait, do it now)
  3. Connect the Webcam and the Timepix to the USB hub
  4. Connect the hub to the Leo4G DAQ PC (this is a retired GEN-II ROS with CC7)
  5. Install PixelMan on the PC as well as the S/W that is required to read the Webcam
  6. Set up a script that allows acquiring images from TimePix and Webcam at the same time

Background measurement

  1. Tape the lens of the Webcam to keep ambient light outside
  2. Acquire a few images. Identify:
    1. Pixels that are never black
    2. Pixels that are sometimes not black
    3. Save the coordinates of these pixels in a calibration file (a minimal sketch is given after this list).
  3. Repeat the process above at different temperatures. Use dry ice or the Peltier cooler to cool the Webcam
  4. Decide if the Webcam needs cooling in T9
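A minimal sketch of this calibration step, assuming the dark frames have been saved as frame_00000.tif ... frame_00009.tif (the names and the number of frames are only examples): a pixel that is non-black in every frame is "never black", a pixel that is non-black in at least one frame is "sometimes not black", and their coordinates are written to a text file.

#include <opencv2/opencv.hpp>
#include <cstdio>

int main() {
    const int nFrames = 10;
    cv::Mat count;                               // per-pixel count of non-black occurrences
    for (int i = 0; i < nFrames; ++i) {
        char name[64];
        std::snprintf(name, sizeof(name), "frame_%05d.tif", i);
        cv::Mat img = cv::imread(name, cv::IMREAD_GRAYSCALE);
        if (img.empty()) return 1;
        if (count.empty()) count = cv::Mat::zeros(img.size(), CV_32S);
        for (int y = 0; y < img.rows; ++y)
            for (int x = 0; x < img.cols; ++x)
                if (img.at<unsigned char>(y, x) > 0) count.at<int>(y, x)++;
    }
    std::FILE* calib = std::fopen("calibration.txt", "w");
    for (int y = 0; y < count.rows; ++y)
        for (int x = 0; x < count.cols; ++x) {
            int n = count.at<int>(y, x);
            if (n == nFrames)  std::fprintf(calib, "%d %d never_black\n", x, y);
            else if (n > 0)    std::fprintf(calib, "%d %d sometimes_not_black\n", x, y);
        }
    std::fclose(calib);
    return 0;
}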

Alignment

  1. Method 1:
    1. Acquire data with all 3 sensors until the Timepix has seen a few cosmic particles
    2. Look for (cosmic) particles that crossed the two Timepix at ~the same time
    3. Estimate where the particle seen by the Timepix should have crossed the Webcam (a minimal extrapolation sketch is given after this list)
    4. Check if the Webcam has a white pixel in the expected region
  2. Method 2: (Maybe to be executed before method 1)
    1. Put a radioactive source in front of the Webcam (remove the first Timepix)
    2. The radiation has to cross the Webcam and hit the second Timepix
    3. Check if on the Timepix one can see a shadow of the CCD of the Webcam
    4. If that works, check (with fewer particles) if matching hits can be found (see method 1)
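A minimal sketch of the extrapolation used in method 1 (the z positions are assumptions; use the measured distances of the two Timepix planes and of the webcam in the sandwich):

#include <cstdio>

struct Hit { double x, y, z; };   // hit position in mm

// Straight-line extrapolation of a track through the two Timepix hits to the webcam plane.
Hit extrapolate(const Hit& a, const Hit& b, double zWebcam) {
    const double t = (zWebcam - a.z) / (b.z - a.z);
    return { a.x + t * (b.x - a.x), a.y + t * (b.y - a.y), zWebcam };
}

int main() {
    Hit first  = {3.2, 1.1,  0.0};   // hypothetical hit on the first Timepix (z = 0 mm)
    Hit second = {3.5, 1.4, 20.0};   // hypothetical hit on the second Timepix (z = 20 mm)
    Hit expected = extrapolate(first, second, 10.0);   // webcam assumed half-way in between
    std::printf("expect a white pixel near (%.2f, %.2f) mm\n", expected.x, expected.y);
    return 0;
}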

Measurements in T9

  1. Repeat the background measurement (this has to be done with each Webcam individually. We have 3 and the team from Italy will bring more)
  2. Install the sandwich on a movable table
  3. Position it far away from the beam axis
  4. Acquire a few images with beam off. Count the number of signals (candidate particles) seen by the Webcam
  5. Turn the beam on
  6. Acquire a few images. Check if the number of candidates increases (maybe we can already see a background effect from scattered particles)
  7. Move the Webcam, very carefully, closer to the beam axis. At each step check if the rate of candidates increases
  8. As soon as an increased rate is seen, move the Webcam to a safe position and repeat the background measurement
  9. Check if the exposure of the Webcam to the beam has caused any damage (white pixels) by comparing with the calibration data
  10. Repeat the steps listed above at different beam energies
  11. Have some clever ideas for additional measurements

 

Development

Date Action

Revision 16 - 2015-07-08 - TimBrooks

Line: 1 to 1
 
META TOPICPARENT name="MajorTasks"
Project Proposal
Proposal video
External article with useful information
Added:
>
>
 

Development

Date Action
Line: 138 to 140
Here is an image from the webcam with the sensor covered and the gain and brightness set to very high values. Hot pixels can be seen directly.
$name
Added:
>
>

Hardware

Dimensions

The active area of the sensor appears to be 4mm x 3mm. Given the maximum resolution of 1280x960 pixels, the pixels are 3.125 microns square.

Mounting

To give flexibility with positioning, the plan is to mount the webcam rigidly between two TimePix sensors. All three can be aligned together and moved, if necessary, without disturbing the alignment.
 

Contacts

Revision 15 - 2015-07-03 - CandanDozen

Line: 1 to 1
 
META TOPICPARENT name="MajorTasks"
Project Proposal
Proposal video
Line: 12 to 12
 
2015-06-15 Pixelman s/w installed on centos 7 , connected TimePix device to PC.
2015-06-12 Met with Jerome Alexandre Alozy for TimePix. Decided to use webcam between 2 TimePix to compare the webcam results.
2015-06-04 ffmpeg s/w installed on bl4sdaq PC ,tested.
Added:
>
>
2015-07-03 Meeting with Jerome to work with several Medipix devices in one readout. Requested a second Timepix and a radioactive source.
 

Software

Revision 14 - 2015-07-01 - TimBrooks

Line: 1 to 1
 
META TOPICPARENT name="MajorTasks"
Project Proposal
Proposal video
Line: 7 to 7
 

Development

Date Action
Added:
>
>
2015-06-23 Managed to disable MJPEG compression on camera
 
2015-06-16 Video for Linux (V4L2) installed on bl4sdaq/blrsdaq1 PC for controlling the webcam from Linux.
yum install v4l-utils
2015-06-15 Pixelman s/w installed on centos 7 , connected TimePix device to PC.
2015-06-12 Met with Jerome Alexandre Alozy for TimePix. Decided to use webcam between 2 TimePix to compare the webcam results.
Line: 65 to 66
  Boolean. Default: true Current: true
Added:
>
>
The camera format can be set like so:
> v4l2-ctl --set-fmt-video=width=1280,height=960,pixelformat=0

The pixel formats are listed with:

> v4l2-ctl --list-formats-ext
ioctl: VIDIOC_ENUM_FMT
   Index       : 0
   Type        : Video Capture
   Pixel Format: 'YUYV'
   Name        : YUV 4:2:2 (YUYV)
...
      Size: Discrete 1280x960
         Interval: Discrete 0.133 s (7.500 fps)
         Interval: Discrete 0.200 s (5.000 fps)

   Index       : 1
   Type        : Video Capture
   Pixel Format: 'MJPG' (compressed)
   Name        : MJPEG
...
      Size: Discrete 1280x960
         Interval: Discrete 0.033 s (30.000 fps)
         Interval: Discrete 0.040 s (25.000 fps)
         Interval: Discrete 0.050 s (20.000 fps)
         Interval: Discrete 0.067 s (15.000 fps)
         Interval: Discrete 0.100 s (10.000 fps)
         Interval: Discrete 0.200 s (5.000 fps)
 Image files can be created by encoding the data as png and writing it to a file per frame:
> gst-launch -v -e v4l2src ! pngenc snapshot=false ! multifilesink location="frame_%05d.png"

Revision 13 - 2015-06-23 - TimBrooks

Line: 1 to 1
 
META TOPICPARENT name="MajorTasks"
Project Proposal
Proposal video
Line: 113 to 113
 
Added:
>
>

References

 

Open issues

Description Action by Status

Revision 12 - 2015-06-23 - TimBrooks

Line: 1 to 1
 
META TOPICPARENT name="MajorTasks"
Project Proposal
Proposal video
Line: 65 to 65
  Boolean. Default: true Current: true
Added:
>
>
Image files can be created by encoding the data as png and writing it to a file per frame:
> gst-launch -v -e v4l2src ! pngenc snapshot=false ! multifilesink location="frame_%05d.png"
 Camera properties can be set during data acquisition using v4l2-ctl. The configuration parameters of the Logitech C270 are as follows:
> v4l2-ctl --list-ctrls
                     brightness (int)    : min=0 max=255 step=1 default=128 value=225

Revision 11 - 2015-06-19 - TimBrooks

Line: 1 to 1
 
META TOPICPARENT name="MajorTasks"
Project Proposal
Proposal video
Line: 7 to 7
 

Development

Date Action
Changed:
<
<
2015-06-16 Vide for Linux (V4L2) installed on bl4sdaq/blrsdaq1 PC for controlling webcam from Linux.
yum install v4l-utils
>
>
2015-06-16 Video for Linux (V4L2) installed on bl4sdaq/blrsdaq1 PC for controlling the webcam from Linux.
yum install v4l-utils
 
2015-06-15 Pixelman s/w installed on centos 7 , connected TimePix device to PC.
2015-06-12 Met with Jerome Alexandre Alozy for TimePix. Decided to use webcam between 2 TimePix to compare the webcam results.
2015-06-04 ffmpeg s/w installed on bl4sdaq PC ,tested.
Line: 101 to 101
  gain may also play a factor - to be explored.
Added:
>
>

Example image

Here is an image from the webcam with the sensor covered and the gain and brightness set to very high values. Hot pixels can be seen directly.
$name
 

Contacts

Line: 113 to 117
 
Look for Linux compatible S/W for reading single images via USB from the camera Candan to be started
Understanding the camera Candan Question: can we control the exposure time? If necessary Markus will contact Logitech about technical support
Deleted:
<
<

 -- TimBrooks - 2015-06-03
Added:
>
>
META FILEATTACHMENT attachment="webcam_example.png" attr="" comment="Webcam high gain image" date="1434724199" name="webcam_example.png" path="webcam_example.png" size="19038" user="brooks" version="1"

Revision 10 - 2015-06-19 - TimBrooks

Line: 1 to 1
 
META TOPICPARENT name="MajorTasks"
Project Proposal
Proposal video
Line: 15 to 15
 

Software

Preview webcam with gstreamer:
Changed:
<
<
gst-launch-0.10 -v -e v4l2src ! timeoverlay ! autovideosink
>
>
> gst-launch-0.10 -v -e v4l2src ! timeoverlay ! autovideosink
 Debugging can be enabled with an environment variable:
Changed:
<
<
GST_DEBUG=3 gst-launch-0.10 -v -e v4l2src ! timeoverlay ! autovideosink
>
>
> GST_DEBUG=3 gst-launch-0.10 -v -e v4l2src ! timeoverlay ! autovideosink
Elements are connected with `!` characters: v4l2src supplies webcam data, timeoverlay takes image data and timestamps it, and autovideosink creates an X11 window and renders the image data into it. Elements can be interrogated for their configuration, e.g.:
Changed:
<
<
gst-inspect v4l2src
>
>
> gst-inspect v4l2src
 ... Element Properties: name : The name of the object
Line: 42 to 42
 
flags
readable, writable String. Default: "/dev/video0" Current: "/dev/video0" device-name : Name of the device
Deleted:
<
<
libv4lconvert: warning more framesizes then I can handle! libv4lconvert: warning more framesizes then I can handle!
 
flags
readable String. Default: null Current: "UVC Camera (046d:0825)" device-fd : File descriptor of the device
Line: 81 to 79
  exposure_auto (menu) : min=0 max=3 default=3 value=1 exposure_absolute (int) : min=1 max=10000 step=1 default=166 value=10000 exposure_auto_priority (bool) : default=0 value=0
Deleted:
<
<
brightness (int) : min=0 max=255 step=1 default=128 value=225 contrast (int) : min=0 max=255 step=1 default=32 value=128 saturation (int) : min=0 max=255 step=1 default=32 value=32 white_balance_temperature_auto (bool) : default=1 value=0 gain (int) : min=0 max=255 step=1 default=64 value=131 power_line_frequency (menu) : min=0 max=2 default=2 value=0 white_balance_temperature (int) : min=0 max=10000 step=10 default=4000 value=1070 sharpness (int) : min=0 max=255 step=1 default=24 value=24 backlight_compensation (int) : min=0 max=1 step=1 default=0 value=0
 
Changed:
<
<
Check individual values with -C e.g.:
>
>
Check individual values with -C <property> e.g.:
 
> v4l2-ctl -C brightness
brightness: 225
Added:
>
>
Set values with -c <property> = value e.g.:
> v4l2-ctl -c exposure_absolute=10000

Properties

exposure_absolute is the shutter length in counts of $100\,\mathrm{\mu s}$. At 10,000 this corresponds to a 1 s exposure, i.e. one frame per second.

exposure_auto must be disabled to prevent the camera changing exposure in response to signal levels. Value 3 is enabled, value 1 is disabled. Values 0 and 2 are not valid.

white_balance_temperature_auto should be disabled - set to 0.

power_line_frequency should be set to 0 to disable any compensation.

brightness can be set quite high while maintaining black levels. Around 240 seems to show hot pixels on the sensor.

gain may also play a factor - to be explored.
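The same properties can also be set programmatically through the V4L2 ioctl interface instead of v4l2-ctl; below is a rough sketch (not part of the project code) using the values quoted above. Depending on the driver, the camera-class exposure controls may need VIDIOC_S_EXT_CTRLS instead of VIDIOC_S_CTRL.

#include <fcntl.h>
#include <unistd.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>
#include <cstdio>

static int setControl(int fd, unsigned id, int value) {
    v4l2_control ctrl = {};
    ctrl.id = id;
    ctrl.value = value;
    return ioctl(fd, VIDIOC_S_CTRL, &ctrl);          // returns -1 on failure
}

int main() {
    int fd = open("/dev/video0", O_RDWR);
    if (fd < 0) { std::perror("open"); return 1; }
    setControl(fd, V4L2_CID_EXPOSURE_AUTO, V4L2_EXPOSURE_MANUAL);  // value 1: automatic exposure off
    setControl(fd, V4L2_CID_EXPOSURE_ABSOLUTE, 10000);             // 10000 x 100 us = 1 s shutter
    setControl(fd, V4L2_CID_AUTO_WHITE_BALANCE, 0);
    setControl(fd, V4L2_CID_POWER_LINE_FREQUENCY, 0);
    setControl(fd, V4L2_CID_BRIGHTNESS, 240);                      // high, but black levels maintained
    setControl(fd, V4L2_CID_GAIN, 255);                            // maximum gain; effect still to be explored
    close(fd);
    return 0;
}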

 

Contacts

Revision 9 - 2015-06-19 - TimBrooks

Line: 1 to 1
 
META TOPICPARENT name="MajorTasks"
Project Proposal
Proposal video
Line: 11 to 11
 
2015-06-15 Pixelman s/w installed on centos 7 , connected TimePix device to PC.
2015-06-12 Met with Jerome Alexandre Alozy for TimePix. Decided to use webcam between 2 TimePix to compare the webcam results.
2015-06-04 ffmpeg s/w installed on bl4sdaq PC ,tested.
Deleted:
<
<
 
Added:
>
>

Software

Preview webcam with gstreamer:
gst-launch-0.10 -v -e v4l2src ! timeoverlay ! autovideosink
Debugging can be enabled with an environment variable:
GST_DEBUG=3 gst-launch-0.10 -v -e v4l2src ! timeoverlay ! autovideosink
Elements are connected with `!` characters: v4l2src supplies webcam data, timeoverlay takes image data and timestamps it, and autovideosink creates an X11 window and renders the image data into it. Elements can be interrogated for their configuration, e.g.:
gst-inspect v4l2src
...
Element Properties:
  name                : The name of the object
                        flags: readable, writable
                        String. Default: null Current: "v4l2src0"
  blocksize           : Size in bytes to read per buffer (-1 = default)
                        flags: readable, writable
                        Unsigned Long. Range: 0 - 18446744073709551615 Default: 4096 Current: 4096
  num-buffers         : Number of buffers to output before sending EOS (-1 = unlimited)
                        flags: readable, writable
                        Integer. Range: -1 - 2147483647 Default: -1 Current: -1
  typefind            : Run typefind before negotiating
                        flags: readable, writable
                        Boolean. Default: false Current: false
  do-timestamp        : Apply current stream time to buffers
                        flags: readable, writable
                        Boolean. Default: false Current: false
  device              : Device location
                        flags: readable, writable
                        String. Default: "/dev/video0" Current: "/dev/video0"
  device-name         : Name of the device
libv4lconvert: warning more framesizes then I can handle!
libv4lconvert: warning more framesizes then I can handle!
                        flags: readable
                        String. Default: null Current: "UVC Camera (046d:0825)"
  device-fd           : File descriptor of the device
                        flags: readable
                        Integer. Range: -1 - 2147483647 Default: -1 Current: -1
  flags               : Device type flags
                        flags: readable
                        Flags "GstV4l2DeviceTypeFlags" Default: 0x00000000, "(none)" Current: 0x00000000, "(none)"
                           (0x00000001): capture          - Device supports video capture
                           (0x00000002): output           - Device supports video playback
                           (0x00000004): overlay          - Device supports video overlay
                           (0x00000010): vbi-capture      - Device supports the VBI capture
                           (0x00000020): vbi-output       - Device supports the VBI output
                           (0x00010000): tuner            - Device has a tuner or modulator
                           (0x00020000): audio            - Device has audio inputs or outputs
  queue-size          : Number of buffers to be enqueud in the driver in streaming mode
                        flags: readable, writable
                        Unsigned Integer. Range: 1 - 16 Default: 2 Current: 2
  always-copy         : If the buffer will or not be used directly from mmap
                        flags: readable, writable
                        Boolean. Default: true Current: true

Camera properties can be set during data acquisition using v4l2-ctl. The configuration parameters of the Logitech C270 are as follows:

> v4l2-ctl --list-ctrls
                     brightness (int)    : min=0 max=255 step=1 default=128 value=225
                       contrast (int)    : min=0 max=255 step=1 default=32 value=128
                     saturation (int)    : min=0 max=255 step=1 default=32 value=32
 white_balance_temperature_auto (bool)   : default=1 value=0
                           gain (int)    : min=0 max=255 step=1 default=64 value=131
           power_line_frequency (menu)   : min=0 max=2 default=2 value=0
      white_balance_temperature (int)    : min=0 max=10000 step=10 default=4000 value=1070
                      sharpness (int)    : min=0 max=255 step=1 default=24 value=24
         backlight_compensation (int)    : min=0 max=1 step=1 default=0 value=0
                  exposure_auto (menu)   : min=0 max=3 default=3 value=1
              exposure_absolute (int)    : min=1 max=10000 step=1 default=166 value=10000
         exposure_auto_priority (bool)   : default=0 value=0
                     brightness (int)    : min=0 max=255 step=1 default=128 value=225
                       contrast (int)    : min=0 max=255 step=1 default=32 value=128
                     saturation (int)    : min=0 max=255 step=1 default=32 value=32
 white_balance_temperature_auto (bool)   : default=1 value=0
                           gain (int)    : min=0 max=255 step=1 default=64 value=131
           power_line_frequency (menu)   : min=0 max=2 default=2 value=0
      white_balance_temperature (int)    : min=0 max=10000 step=10 default=4000 value=1070
                      sharpness (int)    : min=0 max=255 step=1 default=24 value=24
         backlight_compensation (int)    : min=0 max=1 step=1 default=0 value=0

Check individual values with -C e.g.:

> v4l2-ctl -C brightness
brightness: 225
 

Contacts

Revision 8 - 2015-06-17 - CandanDozen

Line: 1 to 1
 
META TOPICPARENT name="MajorTasks"
Project Proposal
Proposal video
Line: 7 to 7
 

Development

Date Action
Changed:
<
<
2015-06-04 WEBCAM, 3MP C270, LOGITECH was ordered from Farnell Element 14 Electronics store.
>
>
2015-06-16 Video for Linux (V4L2) installed on bl4sdaq/blrsdaq1 PC for controlling the webcam from Linux.
yum install v4l-utils
2015-06-15 Pixelman s/w installed on CentOS 7, connected TimePix device to PC.
2015-06-12 Met with Jerome Alexandre Alozy for TimePix. Decided to use the webcam between 2 TimePix to compare the webcam results.
2015-06-04 ffmpeg s/w installed on bl4sdaq PC, tested.

 

Contacts

Revision 7 - 2015-06-16 - TimBrooks

Line: 1 to 1
 
META TOPICPARENT name="MajorTasks"
Deleted:
<
<
<!-- 
  • Set ALLOWTOPICVIEW = TimBrooks, bl4s-detectors
  • Set ALLOWTOPICCHANGE = TimBrooks, bl4s-detectors
-->
 Project Proposal
Proposal video
External article with useful information

Revision 6 - 2015-06-11 - CandanDozen

Line: 1 to 1
 
META TOPICPARENT name="MajorTasks"
<!-- 
  • Set ALLOWTOPICVIEW = TimBrooks, bl4s-detectors

Revision 5 - 2015-06-11 - TimBrooks

Line: 1 to 1
 
META TOPICPARENT name="MajorTasks"
Added:
>
>
<!-- 
  • Set ALLOWTOPICVIEW = TimBrooks, bl4s-detectors
  • Set ALLOWTOPICCHANGE = TimBrooks, bl4s-detectors
-->
 Project Proposal
Proposal video
External article with useful information

Revision 4 - 2015-06-05 - MarkusJoos

Line: 1 to 1
 
META TOPICPARENT name="MajorTasks"
Changed:
<
<
Project Proposal Proposal video
>
>
Project Proposal
Proposal video
External article with useful information
 

Development

Revision 3 - 2015-06-04 - TimBrooks

Line: 1 to 1
 
META TOPICPARENT name="MajorTasks"
Project Proposal Proposal video

Development

Changed:
<
<
2015-06-04
WEBCAM, 3MP C270, LOGITECH was ordered from Farnell Element 14 Electronics store.
>
>
Date Action
2015-06-04 WEBCAM, 3MP C270, LOGITECH was ordered from Farnell Element 14 Electronics store.
 

Contacts

Revision 2 - 2015-06-04 - CandanDozen

Line: 1 to 1
 
META TOPICPARENT name="MajorTasks"
Deleted:
<
<
 Project Proposal Proposal video
Line: 6 to 5
 

Development

Changed:
<
<
2015-06-03
>
>
2015-06-04
WEBCAM, 3MP C270, LOGITECH was ordered from Farnell Element 14 Electronics store.
 

Contacts

Revision 1 - 2015-06-03 - MarkusJoos

Line: 1 to 1
Added:
>
>
META TOPICPARENT name="MajorTasks"

Project Proposal Proposal video

Development

2015-06-03

Contacts

Open issues

Description Action by Status
We should buy the same Web cam as they have Candan Looking for a supplier
Look for Linux compatible S/W for reading single images via USB from the camera Candan to be started
Understanding the camera Candan Question: can we control the exposure time? If necessary Markus will contact Logitech about technical support

-- TimBrooks - 2015-06-03

 