Project Proposal
Proposal video
External article with useful information

Work plan for the execution of the experiment

Below is a list of steps that should be executed during the beam time in September. NOTE: This is just Markus's vision. Input from other people is required and milestones have to be formulated.

Preparation in the lab

  1. Install one Webcam and two Timepix in the sandwich structure
  2. Buy a USB hub (don't wait, do it now)
  3. Connect the Webcam and the Timepix to the USB hub
  4. Connect the hub to the Leo4G DAQ PC (this is a retired GEN-II ROS with CC7)
  5. Install PixelMan on the PC as well as the S/W that is required to read the Webcam
  6. Set up a script that allows acquiring images from the Timepix and the Webcam at the same time

Status of the preparation:
  1. Webcam modified by removing the protective glass filter on the sensor and installed in the sandwich structure with the two Timepix.
  2. 7-port powered mobile USB hub bought.
  3. The connection chain is complete: Webcam & Timepix -> USB hub -> Leo4G DAQ PC.
  4. PixelMan and OpenCV installed for the Timepix and the Webcam respectively.

Background measurement

  1. Tape over the lens of the Webcam to keep ambient light out
  2. Acquire a few images. Identify:
    1. Pixels that are never black
    2. Pixels that are sometimes not black
  3. Save the coordinates of these pixels in a calibration file (a minimal sketch of this classification is shown after this list)
  4. Repeat the process above at different temperatures. Use dry ice or the Peltier cooler to cool the Webcam
  5. Decide if the Webcam needs cooling in T9
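
Below is a minimal OpenCV sketch of this hot/flickering-pixel classification. It is only an illustration, not the final analysis code: the frame count, the threshold, the file names (output/frame_<n>.tif) and the calibration-file format are all assumptions.

// classify_pixels.cpp -- illustrative sketch: classify hot and flickering
// pixels from a set of dark frames. Frame count, threshold, file names and
// output format are assumptions, not the values used in the experiment.
#include <opencv2/opencv.hpp>
#include <cstdio>

int main()
{
    const int nFrames   = 100;  // assumed number of dark frames
    const int threshold = 50;   // assumed pixel-value threshold for "not black"

    cv::Mat counts;             // per-pixel count of frames above threshold
    for (int i = 0; i < nFrames; ++i) {
        char name[64];
        std::snprintf(name, sizeof(name), "output/frame_%d.tif", i);
        cv::Mat frame = cv::imread(name, cv::IMREAD_GRAYSCALE);
        if (frame.empty()) continue;
        if (counts.empty()) counts = cv::Mat::zeros(frame.size(), CV_32S);
        for (int y = 0; y < frame.rows; ++y)
            for (int x = 0; x < frame.cols; ++x)
                if (frame.at<uchar>(y, x) > threshold)
                    counts.at<int>(y, x)++;
    }

    // Calibration file: "hot" = above threshold in every frame (never black),
    // "flicker" = above threshold in some but not all frames.
    std::FILE* calib = std::fopen("calibration.txt", "w");
    for (int y = 0; y < counts.rows; ++y)
        for (int x = 0; x < counts.cols; ++x) {
            const int c = counts.at<int>(y, x);
            if (c == nFrames)   std::fprintf(calib, "hot %d %d\n", x, y);
            else if (c > 0)     std::fprintf(calib, "flicker %d %d\n", x, y);
        }
    std::fclose(calib);
    return 0;
}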

Alignment

  1. Method 1:
    1. Acquire data with all 3 sensors until the Timepix has seen a few cosmic particles
    2. Look for (cosmic) particles that crossed the two Timepix at ~the same time
    3. Estimate where the particle seen by the Timepix should have crossed the Webcam (see the extrapolation sketch after this list)
    4. Check if the Webcam has a white pixel in the expected region
  2. Method 2 (maybe to be executed before method 1):
    1. Put a radioactive source in front of the Webcam (remove the first Timepix)
    2. The radiation has to cross the Webcam and hit the second Timepix
    3. Check if a shadow of the Webcam's CCD can be seen on the Timepix
    4. If that works, check (with fewer particles) if matching hits can be found (see method 1)
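
The geometric idea behind method 1 is a straight-line extrapolation: interpolate the track defined by the two Timepix hits to the z position of the Webcam plane and convert the result to a pixel index. The sketch below is purely illustrative; the plane positions, hit coordinates and the assumption that the sensor origin coincides with pixel (0,0) are placeholders, not the real geometry of the sandwich.

// extrapolate.cpp -- sketch of the straight-line extrapolation used in
// alignment method 1. All coordinates and z positions are placeholder values.
#include <cstdio>

struct Hit { double x, y, z; };   // hit position in mm, z along the beam

int main()
{
    const double zWebcam = 10.0;          // assumed z of the webcam plane (mm)
    Hit tpx1 = { 1.20, 0.80,  0.0 };      // hit on first Timepix (placeholder)
    Hit tpx2 = { 1.50, 0.95, 20.0 };      // hit on second Timepix (placeholder)

    // Linear interpolation of the track to the webcam plane.
    double t  = (zWebcam - tpx1.z) / (tpx2.z - tpx1.z);
    double xW = tpx1.x + t * (tpx2.x - tpx1.x);
    double yW = tpx1.y + t * (tpx2.y - tpx1.y);

    // Convert to webcam pixel indices, assuming the 3.125 um pitch quoted in
    // the Hardware section and the sensor origin at pixel (0,0) -- an assumption.
    const double pitch = 0.003125;        // mm per pixel
    std::printf("expected webcam pixel: (%.0f, %.0f)\n", xW / pitch, yW / pitch);
    return 0;
}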

Measurements in T9

  1. Repeat the background measurement (this has to be done for each Webcam individually; we have 3 and the team from Italy will bring more)
  2. Install the sandwich on a movable table
  3. Position it far away from the beam axis
  4. Acquire a few images with the beam off. Count the number of signals (candidate particles) seen by the Webcam
  5. Turn the beam on
  6. Acquire a few images. Check if the number of candidates increases (maybe we can already see a background effect from scattered particles)
  7. Move the Webcam, very carefully, closer to the beam axis. At each step check if the rate of candidates increases (see the sketch after this list)
  8. As soon as an increased rate is seen, move the Webcam to a safe position and repeat the background measurement
  9. Check if the exposure of the Webcam to the beam has caused any damage (white pixels) by comparing with the calibration data
  10. Repeat the steps listed above at different beam energies
  11. Have some clever ideas for additional measurements
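
One possible way to quantify "the rate of candidates increases" in step 7 is to compare the per-frame candidate count against the beam-off rate, assuming Poisson statistics. This is only a sketch with placeholder numbers, not an agreed analysis procedure.

// rate_check.cpp -- sketch of the rate comparison for the T9 scan: compare
// the candidate count of the current step against the beam-off (background)
// rate. The numbers below are placeholders.
#include <cmath>
#include <cstdio>

int main()
{
    double bgMean = 3.2;          // placeholder: mean candidates per frame, beam off
    double nSigma = 5.0;          // significance required to call it an increase

    // Candidate counts observed at the current table position (placeholders).
    int observed[] = { 4, 3, 6, 21, 19 };
    for (int n : observed) {
        // Poisson fluctuation of the background: sigma = sqrt(mean).
        bool increased = n > bgMean + nSigma * std::sqrt(bgMean);
        std::printf("frame with %2d candidates: %s\n",
                    n, increased ? "rate increased" : "consistent with background");
    }
    return 0;
}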

Development

Date Action
2015-07-20 OpenCV installed on pcbl4sleo4g PC.
2015-07-16 CentOS 7 installed on new hard disk. PC name is: pcbl4sleo4g.
2015-07-03 Meeting with Jerome to work with several Medipix devices in one readout. Request for second Timepix and radioactive source.
2015-06-23 Managed to disable MJPEG compression on camera.
2015-06-16 Video for Linux (V4L2) installed on bl4sdaq/blrsdaq1 PC for controlling the webcam from Linux:
yum install v4l-utils
2015-06-15 Pixelman s/w installed on CentOS 7, connected TimePix device to PC.
2015-06-12 Met with Jerome Alexandre Alozy for TimePix. Decided to use the webcam between 2 TimePix to compare the webcam results.
2015-06-04 ffmpeg s/w installed on bl4sdaq PC, tested.

Software

Preview webcam with gstreamer:

> gst-launch-0.10 -v -e v4l2src ! timeoverlay ! autovideosink
Debugging can be enabled with an environment variable:
> GST_DEBUG=3 gst-launch-0.10 -v -e v4l2src ! timeoverlay ! autovideosink
Elements are connected with `!` characters: v4l2src supplies the webcam data, timeoverlay timestamps the image data, and autovideosink creates an X11 window and renders the image data into it. Elements can be interrogated for their configuration, e.g.:
> gst-inspect v4l2src
...
Element Properties:
  name                : The name of the object
                        flags: readable, writable
                        String. Default: null Current: "v4l2src0"
  blocksize           : Size in bytes to read per buffer (-1 = default)
                        flags: readable, writable
                        Unsigned Long. Range: 0 - 18446744073709551615 Default: 4096 Current: 4096
  num-buffers         : Number of buffers to output before sending EOS (-1 = unlimited)
                        flags: readable, writable
                        Integer. Range: -1 - 2147483647 Default: -1 Current: -1
  typefind            : Run typefind before negotiating
                        flags: readable, writable
                        Boolean. Default: false Current: false
  do-timestamp        : Apply current stream time to buffers
                        flags: readable, writable
                        Boolean. Default: false Current: false
  device              : Device location
                        flags: readable, writable
                        String. Default: "/dev/video0" Current: "/dev/video0"
  device-name         : Name of the device
                        flags: readable
                        String. Default: null Current: "UVC Camera (046d:0825)"
  device-fd           : File descriptor of the device
                        flags: readable
                        Integer. Range: -1 - 2147483647 Default: -1 Current: -1
  flags               : Device type flags
                        flags: readable
                        Flags "GstV4l2DeviceTypeFlags" Default: 0x00000000, "(none)" Current: 0x00000000, "(none)"
                           (0x00000001): capture          - Device supports video capture
                           (0x00000002): output           - Device supports video playback
                           (0x00000004): overlay          - Device supports video overlay
                           (0x00000010): vbi-capture      - Device supports the VBI capture
                           (0x00000020): vbi-output       - Device supports the VBI output
                           (0x00010000): tuner            - Device has a tuner or modulator
                           (0x00020000): audio            - Device has audio inputs or outputs
  queue-size          : Number of buffers to be enqueud in the driver in streaming mode
                        flags: readable, writable
                        Unsigned Integer. Range: 1 - 16 Default: 2 Current: 2
  always-copy         : If the buffer will or not be used directly from mmap
                        flags: readable, writable
                        Boolean. Default: true Current: true

The camera format can be set like so:

> v4l2-ctl --set-fmt-video=width=1280,height=960,pixelformat=0

The pixel formats are listed with:

> v4l2-ctl --list-formats-ext
ioctl: VIDIOC_ENUM_FMT
   Index       : 0
   Type        : Video Capture
   Pixel Format: 'YUYV'
   Name        : YUV 4:2:2 (YUYV)
...
      Size: Discrete 1280x960
         Interval: Discrete 0.133 s (7.500 fps)
         Interval: Discrete 0.200 s (5.000 fps)

   Index       : 1
   Type        : Video Capture
   Pixel Format: 'MJPG' (compressed)
   Name        : MJPEG
...
      Size: Discrete 1280x960
         Interval: Discrete 0.033 s (30.000 fps)
         Interval: Discrete 0.040 s (25.000 fps)
         Interval: Discrete 0.050 s (20.000 fps)
         Interval: Discrete 0.067 s (15.000 fps)
         Interval: Discrete 0.100 s (10.000 fps)
         Interval: Discrete 0.200 s (5.000 fps)

Image files can be created by encoding the data as PNG and writing one file per frame:

> gst-launch -v -e v4l2src ! pngenc snapshot=false ! multifilesink location="frame_%05d.png"

Camera properties can be set during data acquisition using v4l2-ctl. The configuration parameters of the Logitech C270 are as follows:

> v4l2-ctl --list-ctrls
                     brightness (int)    : min=0 max=255 step=1 default=128 value=225
                       contrast (int)    : min=0 max=255 step=1 default=32 value=128
                     saturation (int)    : min=0 max=255 step=1 default=32 value=32
 white_balance_temperature_auto (bool)   : default=1 value=0
                           gain (int)    : min=0 max=255 step=1 default=64 value=131
           power_line_frequency (menu)   : min=0 max=2 default=2 value=0
      white_balance_temperature (int)    : min=0 max=10000 step=10 default=4000 value=1070
                      sharpness (int)    : min=0 max=255 step=1 default=24 value=24
         backlight_compensation (int)    : min=0 max=1 step=1 default=0 value=0
                  exposure_auto (menu)   : min=0 max=3 default=3 value=1
              exposure_absolute (int)    : min=1 max=10000 step=1 default=166 value=10000
         exposure_auto_priority (bool)   : default=0 value=0

Check individual values with -C <property> e.g.:

> v4l2-ctl -C brightness
brightness: 225

Set values with -c <property> = value e.g.:

> v4l2-ctl -c exposure_absolute=10000

Properties

exposure_absolute is the shutter length in units of $100\,\mathrm{\mu s}$; a value of 10,000 gives $10{,}000 \times 100\,\mathrm{\mu s} = 1\,\mathrm{s}$ of exposure, i.e. at most one frame per second.

exposure_auto must be disabled to prevent the camera changing exposure in response to signal levels. Value 3 is enabled, value 1 is disabled. Values 0 and 2 are not valid.

white_balance_temperature_auto should be disabled - set to 0.

power_line_frequency should be set to 0 to disable any compensation.

brightness can be set quite high while maintaining black levels. Around 240 seems to show hot pixels on the sensor.

gain may also play a factor - to be explored.

Example image

Here is an image from the webcam with the sensor covered and the gain and brightness set to very high values. Hot pixels can be seen directly.
[image attachment: webcam high-gain example]

Read Webcam and Analyze the image

Log in to the "pcbl4sleo4g" PC as daquser (example: ssh -Y daquser@pcbl4sleo4g, password: BeamLine15).

Go to the "Webcam" directory.

In order to control the webcam, go to the "Trigger" directory:

/afs/cern.ch/user/d/daquser/public/Webcam/Trigger

To compile it (if you have made changes to the source code):

cmake .

make

Camera properties are set in the "run.sh" script:

run.sh

 
###############################################################################################################
#! /bin/bash
# Switch off automation
v4l2-ctl -c exposure_auto_priority=0
v4l2-ctl -c exposure_auto=1
v4l2-ctl -c backlight_compensation=0
v4l2-ctl -c white_balance_temperature_auto=0
v4l2-ctl -c power_line_frequency=0

# Set shutter to 1 second
v4l2-ctl -c exposure_absolute=10000
# Amplify as much as possible
v4l2-ctl -c gain=255
v4l2-ctl -c contrast=255
v4l2-ctl -c brightness=242

v4l2-ctl --set-fmt-video=width=1280,height=960,pixelformat=0
#v4l2-ctl -c --set-param=1

# Non-critical
v4l2-ctl -c saturation=32
v4l2-ctl -c white_balance_temperature=6500
./Trigger
###############################################################################################################

After starting the bash script (./run.sh) two windows will appear on the screen.

Title of window 1 (W1): Trigger. Title of window 2 (W2): Webcam.


[image attachment: Trigger (W1) and Webcam (W2) windows]

The purpose of W1 is to control the settings of the WebCam. You can, by moving the sliders, change the settings of the camera. The parameters are defined as follows:

Parameter Description
Threshold (Pixel Value): A pixel can have a saturation from 0 (black) to 255 (white). With this threshold you set the limit above which a pixel will be saved in the output file; with the threshold set to 150, for example, a pixel with a saturation of 100 will not be recorded. The purpose of this parameter is to reduce the amount of noise in the picture.
Num of active Pixel: This is not a control parameter but the number of pixels above threshold in the current frame.
Threshold (num of pixel): A picture will only be recorded if the number of pixels above threshold (see the first parameter) is larger than this number. If this parameter is set to 50 and a frame has only 39 pixels above threshold, no file will be recorded. This parameter also helps to suppress background.
Exposure: The exposure time of a frame in seconds.

The area below the sliders shows you the last frame that was captured by the camera. It shows all frames, not only those that are recorded. The second window, W2, also shows the current frame, but only the frames that pass the trigger cut (threshold and number of active pixels) will be saved.
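
For reference, here is a minimal OpenCV sketch of this trigger decision. It is not the actual Trigger.cpp; the thresholds, the capture device index and the output file names are assumptions.

// trigger_cut.cpp -- illustrative sketch of the trigger cut: count pixels
// above the pixel-value threshold and keep the frame only if that count
// exceeds the second threshold. All parameters are assumptions.
#include <opencv2/opencv.hpp>
#include <cstdio>

int main()
{
    const int pixelValueThreshold = 150;  // slider "Threshold (Pixel Value)"
    const int numOfPixelThreshold = 50;   // slider "Threshold (num of pixel)"

    cv::VideoCapture cap(0);              // assumed: webcam is video device 0
    if (!cap.isOpened()) return 1;

    cv::Mat frame, gray, mask;
    for (int i = 0; cap.read(frame); ++i) {
        cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);
        cv::threshold(gray, mask, pixelValueThreshold, 255, cv::THRESH_BINARY);
        int nActive = cv::countNonZero(mask);      // "Num of active Pixel"
        if (nActive > numOfPixelThreshold) {
            char name[64];
            std::snprintf(name, sizeof(name), "output/frame_%d.tif", i);
            cv::imwrite(name, gray);               // frame passes the trigger cut
        }
    }
    return 0;
}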

Leave it running for a while. Images (at 1 fps, or whatever exposure time you have set) will be saved with the ".tif" extension in the "output" directory.

The "Trigger" program, written with OpenCV, can be found in the git repository: link to Git Repository for Triggering webcam with opencv

Example of background image:


[image attachment: example background image]

Background Measurement

The program "BG_Measurement.cpp" has been developed to analyse the images taken by the webcam. It uses the Root analysis framework for the generation of histograms.

Let's first build the binaries:

cd /afs/cern.ch/user/d/daquser/public/Webcam/Analysis/

To compile it:

./compile-root.sh BG_Measurement

This will create the file BG_Measurement.exe.

Before using the application and Root you must set up your environment. Execute these commands:

  export LD_LIBRARY_PATH=/usr/local/lib:$LD_LIBRARY_PATH
  source root/bin/thisroot.sh

The program BG_Measurement.cpp is command-line driven. You have to specify the number of files to be analyzed (e.g. 150) and the threshold (e.g. 128).

Example

./BG_Measurement.exe  -n 150 -t 128

The program will read the number of files and the threshold specified on the command line. It will collect and sum all background images from the "output" directory (/afs/cern.ch/user/d/daquser/public/Webcam/Trigger/output) in order to produce a single background image.

Finally it will generate two files named background.root and BG_Total.tif. Please save these files immediately in your data area because they will be overwritten the next time you execute BG_Measurement. You can display the image (under Linux) with the command "display BG_Total.tif".

If you would like to browse the output ROOT file, type the following commands in a terminal:

root -l background.root    (loads the ROOT file)

new TBrowser               (opens a browser to inspect the histograms)

Also, on the displayed image you will find two histograms. The first one has the title "BW_SCALE". The X-axis represents the horizontal pixels of the WebCam and the Y-axis the vertical pixels, so each dot in the histogram represents one pixel. The color of a pixel in the histogram tells you how often the respective pixel was above threshold in the input images. The purpose is to detect pixels that are not working properly: if you put the WebCam into a dark box and record images, all images should ideally be black. A CCD may have two types of dead pixels:

  1. Pixels that are always white (or at least not black)
  2. Pixels that flicker (i.e. change color from image to image)

In the "BW_SCALE" histogram you can see, via the color grading, which pixels are dead (they should be red) and which are flickering (they are, for example, green or blue). This information will help you later to decide if a pixel has been hit by a particle.


[image attachment: BW_SCALE histogram]

The second histogram ("hvalue") shows on the x-axis the saturation of the pixels from 0 (black) to 255 (white). The value on the Y-axis is the number of pixels with that saturation.
[image attachment: hvalue histogram]
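
For reference, here is a minimal ROOT macro sketch of how histograms like BW_SCALE and hvalue can be filled from a set of frames. It is only an illustration, not the code of BG_Measurement.cpp; the file names, frame count, threshold and the assumption that ROOT's TASImage can read the TIFF frames are placeholders.

// bg_histos.C -- illustrative ROOT macro: fill an occupancy map (like
// BW_SCALE) and a pixel-value distribution (like hvalue) from dark frames.
#include "TH1I.h"
#include "TH2I.h"
#include "TFile.h"
#include "TString.h"
#include "TASImage.h"

void bg_histos(int nFrames = 150, int threshold = 128)
{
    TH2I* bwScale = new TH2I("BW_SCALE", "frames above threshold;x pixel;y pixel",
                             1280, 0, 1280, 960, 0, 960);
    TH1I* hvalue  = new TH1I("hvalue", "pixel saturation;value;number of pixels",
                             256, 0, 256);

    for (int i = 0; i < nFrames; ++i) {
        TASImage img(Form("output/frame_%d.tif", i));   // assumes TIFF support
        if (!img.IsValid()) continue;
        UInt_t* argb = img.GetArgbArray();              // one 32-bit ARGB word per pixel
        for (UInt_t y = 0; y < img.GetHeight(); ++y)
            for (UInt_t x = 0; x < img.GetWidth(); ++x) {
                int value = argb[y * img.GetWidth() + x] & 0xFF;  // grey level (R=G=B)
                hvalue->Fill(value);
                if (value > threshold) bwScale->Fill(x, y);
            }
    }

    TFile out("background.root", "RECREATE");
    bwScale->Write();
    hvalue->Write();
    out.Close();
}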

Once you have identified the dead and flickering pixels of your CCD you should carefully archive the result. This file (*.tif format), let's call it the calibration frame (CF), will be required by the next step.

Imagine you take one frame with the WebCam exposed to beam. Let's call this the beam_frame (BF). In order to tell which pixels in the BF have been hit by the beam you have to filter out the malfunctioning pixels. Mathematically this means: Pixels hit by beam = BF - CF.

Such a subtraction of two images can be done with the program Subtract_images.cpp.
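
As an illustration of the BF - CF idea (this is not the actual Subtract_images.cpp; the file names and the threshold of 128 are assumptions):

// subtract_sketch.cpp -- illustrative OpenCV sketch of BF - CF followed by a
// noise threshold. File names and threshold are placeholders.
#include <opencv2/opencv.hpp>

int main()
{
    cv::Mat bf = cv::imread("beam_frame.tif",        cv::IMREAD_GRAYSCALE);  // BF
    cv::Mat cf = cv::imread("calibration_frame.tif", cv::IMREAD_GRAYSCALE);  // CF
    if (bf.empty() || cf.empty() || bf.size() != cf.size()) return 1;

    // Saturating subtraction: a pixel survives only where the beam frame is
    // brighter than the calibration frame.
    cv::Mat diff;
    cv::subtract(bf, cf, diff);

    // Keep only pixels above an (assumed) noise threshold.
    cv::Mat hits;
    cv::threshold(diff, hits, 128, 255, cv::THRESH_BINARY);

    cv::imwrite("beam_hits.tif", hits);
    return 0;
}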

Before you start Subtract_images you have to execute:

  
export LD_LIBRARY_PATH=/usr/local/lib:$LD_LIBRARY_PATH
source root/bin/thisroot.sh

Second method for analysis with OpenCV:

Background Measurement

The program "BackgroundCreator.cpp" produce an output background image with "*.tif " extension by taking average of the all frames which are taken from webcam as background.

cd /afs/cern.ch/user/d/daquser/public/Webcam/Image_Analyis_withOpenCV/opencv/Analysis/BackgroundCreator

To compile it (if you have made changes to the source code):

cmake .

make

The script "run_background.sh" reads each background image from the output directory:

run_background.sh

###############################################################################################################
#!/bin/bash

BACKGROUNDS="../webcam_bg_output/"
COMBINED=""

for file in "$BACKGROUNDS/*.tiff"; do
  COMBINED="$COMBINED $file"
done

./BackgroundCreator $COMBINED out.tif
###############################################################################################################

After running the bash script (./run_background.sh), the file "out.tif" will be created. Once we have this file we can go to the next step.

Subtraction and Analysis

"Substract.cpp" program has been developed to have a clear signal by subtracting calibration frame from each beam frames as it mentioned in previous section. it also count the number of pixels for each frames. We can analyze the subtracting images with "*Analysis.cpp*" program. This program gives us information about the number of pixels with the saturation value from 0 (black) to 255 (white) within .txt file.

cd /afs/cern.ch/user/d/daquser/public/Webcam/Image_Analyis_withOpenCV/opencv/Analysis

Both programs are compiled in the same way if you have made any changes to the code:

cmake .

make

To execute these programs, the bash script "run.sh" was created.

After starting the script (./run.sh), two kinds of files will be produced: ".tif" files from Subtract.cpp and ".txt" files from Analysis.cpp.

###############################################################################################################
#!/bin/bash

CURRENT_DIR=`pwd`
OUTPUT="$CURRENT_DIR/output"  ////create an "output" folder to write the produced files.
CAPTURED="$CURRENT_DIR/webcam_data"   ////read beam frames from "webcam_data" folder 

SUBTRACT_THRESHOLD=128
SUBTRACT_BACKGROUND="$CURRENT_DIR/BackgroundCreator/out.tif"   # the calibration frame

# Loop: subtract the calibration frame ("out.tif") from each beam frame.
cd $CAPTURED
for file in *.tif; do
    echo "processing $file..."
    $CURRENT_DIR/Subtract $file $SUBTRACT_BACKGROUND $SUBTRACT_THRESHOLD "$OUTPUT/$file.subtracted.tif"
    $CURRENT_DIR/Analysis "$OUTPUT/$file.subtracted.tif" > "$OUTPUT/$file.txt"   # write a .txt file for each subtracted image
done
###############################################################################################################

You can find the "Analysis "programs written by opencv is in git repository: link to Git Repository for Image Analysis with opencv

Test with Radioactive Source

A strontium radioactive source was used. In order to estimate the background, several images were first taken with the webcam without any source; afterwards the measurement was repeated with the radioactive source.

Example background image:


[image attachment: example background image]

Analyze the Background Image with the BG_Measurement.cpp program

cd /afs/cern.ch/user/d/daquser/public/Webcam/Image_Analyis_withOpenCV/opencv/Analysis

Try several threshold values to find the ideal threshold.

Execute BG_Measurement.exe for 63 files with a threshold value of 25:

./BG_Measurement.exe -n 63 -t 25 > BG_output.txt

"BG_output.txt" has information about image like: Dead Pixels and their coordinates, Pixel values above threshold and their coordinates, Number of Pixels with saturation values.

* BG_output.txt: Contains information about BG images

Based on the information in the BG_output.txt file, the histogram of the background image looks as follows.


[image attachment: histogram of the background image]

1. The first Timepix was removed and the radioactive source was put in front of the Webcam.

Example image taken with the radioactive source:


[image attachment: example image taken with the radioactive source]

Hardware

Dimensions

The active area of the sensor appears to be 4mm x 3mm. Given the maximum resolution of 1280x960 pixels, the pixels are 3.125 microns square.
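
This pitch follows directly from the quoted sensor size and resolution:

$\frac{4\,\mathrm{mm}}{1280\ \mathrm{pixels}} = \frac{3\,\mathrm{mm}}{960\ \mathrm{pixels}} = 3.125\,\mathrm{\mu m}$ per pixel.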

Mounting

To give flexibility with positioning, the plan is to mount the webcam rigidly between two TimePix sensors. All three can be aligned together and moved, if necessary, without disturbing the alignment.

SHIFT INSTRUCTIONS

Log in to the pcbl4sleo4g PC as daquser: ssh -Y daquser@pcbl4sleo4g.cern.ch (password: BeamLine15)

1. BACKGROUND data taking with the webcam:

Flow chart:


[image attachment: background data-taking flow chart]

Go to "Trigger" directory:

cd /afs/cern.ch/user/d/daquser/public/Webcam/Trigger

Create a new folder for the background output files:

mkdir output_background

Define the name of the output folder in Trigger.cpp code:

sprintf(fileName, "output_background/frame_%d.tif", frameNumber);

After making the changes in the code, recompile it:

cmake .

make

To execute Trigger.cpp:

STAGE 1:

./run.sh

STAGE 2:

After starting the bash script (./run.sh) two windows will appear on the screen. Look at the figure below.

Title of window 1 (W1): Trigger. Title of window 2 (W2): Webcam.


[image attachment: Trigger (W1) and Webcam (W2) windows]

The purpose of W1 is to control the settings of the WebCam.

STAGE 3:

You can change the threshold settings either by moving the sliders on the track bar (W1) or directly in the Trigger.cpp program. In the code:

int pixelValueThreshold = 150;  // [changeable] a pixel counts as active (white) only if its value is greater than 150; in background runs these active pixels are the "dead pixels"
int numOfPixelThreshold = 50;   // [changeable] a frame is saved only if at least 50 pixels are above pixelValueThreshold

STAGE 4:

If the threshold setting is OK (Stage 3) you will see the captured images being saved, reported in the console as follows:


[image attachment: console screenshot (still to be attached)]

STAGE 5:

Go to the output_background folder :

cd /afs/cern.ch/user/d/daquser/public/Webcam/Trigger/output_background

You need to save at least 1000 frames for background.

2. DATA taking:

Flow Chart for data taking


[image attachment: data-taking flow chart]

Note: you only need to create a new folder for data taking in the Trigger directory, as was done for the background data taking (example: mkdir output_data).

Execute the "Trigger.cpp" program again and run the bash script:

cmake .

make

STAGE 1: Start the Trigger:

./run.sh

Repeat all the steps as done for the background data taking.

Start Timepix:

Go to "pixelman" directory:

cd /afs/cern.ch/user/d/daquser/public/Webcam/pixelman/Pixelman_2013_09_25_x64/

Execute "pixelman"

./pixelman.sh

The Timepix Control Panel will appear:

Change the folder name where your output files will be saved. You don't need to change any other settings; all settings should be left at their defaults.

Contacts

References

Open issues

Description | Action by | Status
We should buy the same webcam as they have | Candan | Looking for a supplier
Look for Linux-compatible S/W for reading single images via USB from the camera | Candan | To be started
Understanding the camera | Candan | Question: can we control the exposure time? If necessary Markus will contact Logitech about technical support

List of Equipment

  1. 3 Logitech C270 Webcams
  2. 2 Timepix with FITPIX
  3. 5 SATA hard disks
  4. 1 USB hub (7-port powered mobile hub) and cables

-- TimBrooks - 2015-06-03

Topic attachments
Attachment | Size | Date | Who | Comment
BG_output.txt | 28.1 K | 2015-09-02 | CandanDozen | Contains information about BG images
BG_without_source.png | 23.0 K | 2015-09-02 | CandanDozen | Histogram of BG image
BW_SCALE.png | 14.2 K | 2015-08-31 | CandanDozen |
Background_Taking_Chart.png | 121.9 K | 2015-09-06 | CandanDozen |
Data_Taking_Chart.png | 89.1 K | 2015-09-07 | CandanDozen |
Webcam.png | 124.1 K | 2015-08-13 | CandanDozen |
background.png | 230.9 K | 2015-09-01 | CandanDozen |
frame_0.png | 53.3 K | 2015-08-31 | CandanDozen |
hvalue.png | 10.7 K | 2015-08-31 | CandanDozen |
str_webcam.jpg | 62.0 K | 2015-09-01 | CandanDozen |
webcam_example.png | 18.6 K | 2015-06-19 | TimBrooks | Webcam high gain image