
1. A lighting device capable of detecting a condition in an environment comprising:
a sensor to capture an image from a location in the environment;
a microcontroller, further comprising a processor and memory, wherein the sensor is electrically connected to the microcontroller and wherein the image is transmitted to the microcontroller as data;
a light source electrically connected to and controlled by the microcontroller;
a rules engine stored in the memory, wherein the processor compares the data to rules contained in the rules engine to produce an output; and
an interface through which the output is accessible;
wherein at least part of the rules engine defines an analysis of the data including the steps of:
capturing a subsequent image;
capturing a current image;
comparing the subsequent image and the current image to a known object, wherein the known object is stored in the memory; and
generating the output of the analysis with respect to a determination of whether the known object is detected in the subsequent image;

wherein at least part of the rules defining detecting the known object in the subsequent image include the steps of:
defining a subsequent background in the subsequent image and a current background in the current image by detecting an unchanging image;
detecting an anomaly between the subsequent background and current background indicating that the known object appears in either the subsequent image or the current image; and
generating the output responsive to detection of the anomaly;
wherein the output is storable in the memory.
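The detection steps recited in claim 1 resemble classic frame-differencing background subtraction: unchanged pixels define the background, and changed pixels flag an anomaly to match against the stored known object. A minimal illustrative sketch (function name, threshold, and grayscale 2-D-list frame representation are assumptions, not recitations from the claims):

```python
def detect_anomaly(current_frame, subsequent_frame, threshold=10):
    """Compare two grayscale frames given as 2-D lists of pixel intensities.

    Pixels whose intensity differs by more than `threshold` between the
    current and subsequent frames are flagged as anomalous; unchanged
    pixels are treated as background, mirroring the "unchanging image"
    step in the claim.  Returns the (row, col) coordinates that changed.
    """
    anomalies = []
    for r, (row_cur, row_sub) in enumerate(zip(current_frame, subsequent_frame)):
        for c, (p_cur, p_sub) in enumerate(zip(row_cur, row_sub)):
            if abs(p_cur - p_sub) > threshold:
                anomalies.append((r, c))
    return anomalies
```

The returned coordinates could then be compared against known-object templates stored in memory to generate the claimed output.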
2. A lighting device according to claim 1 wherein the analysis of the data defined by at least part of the rules engine further includes the step of generating the output of the analysis with respect to whether the known object is detected at a specific location in the subsequent image.
3. A lighting device according to claim 1 wherein the analysis of the data defined by at least part of the rules engine further includes the step of generating the output of the analysis with respect to whether the known object is detected at a plurality of locations in the subsequent image.
4. A lighting device according to claim 1 wherein the microcontroller is electrically connected to a network.
5. A lighting device according to claim 4 wherein the sensor comprises a plurality of sensors that are electrically connected to the network.
6. A lighting device according to claim 5 wherein the network is electrically connected to a centralized computing device; and wherein the centralized computing device analyzes the data transmitted by the plurality of sensors.
7. A lighting device according to claim 5 wherein each of the plurality of sensors is positioned at locations throughout the environment in a uniform manner; wherein the images are captured by each of the plurality of sensors in the locations throughout the environment in an approximately uniform manner; and wherein the images are concatenated to create an overview that includes substantially all of the locations throughout the environment substantially seamlessly.
8. A lighting device according to claim 7 wherein the locations throughout the environment are configured relating to an approximately grid based pattern.
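Claims 7-8 describe concatenating uniformly captured images from a grid of sensor locations into a single seamless overview. A minimal sketch of that row-major tiling, assuming equally sized grayscale tiles represented as 2-D lists (the function name and grid parameter are illustrative only):

```python
def stitch_overview(tiles, grid_cols):
    """Concatenate equally sized grayscale tiles (2-D lists of pixels),
    captured on an approximately grid-based pattern, into one overview
    image in row-major order.  Assumes uniform tile dimensions."""
    tile_h = len(tiles[0])
    overview = []
    # Take grid_cols tiles at a time as one grid row of the overview.
    for start in range(0, len(tiles), grid_cols):
        row_tiles = tiles[start:start + grid_cols]
        for y in range(tile_h):
            stitched_row = []
            for t in row_tiles:
                stitched_row.extend(t[y])  # append this tile's y-th pixel row
            overview.append(stitched_row)
    return overview
```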
9. A lighting device according to claim 5 wherein the plurality of sensors are positioned to capture the images using similar viewing angles relative to the environment.
10. A lighting device according to claim 5 wherein the plurality of sensors intercommunicate through the network using mesh networking.
11. A lighting device according to claim 1 wherein the sensor comprises a camera.
12. A lighting device according to claim 11 wherein the image is a video frame captured by the camera.
13. A lighting device according to claim 1 wherein an event is definable in the rules to relate to the known object being detected in the environment; wherein the event is associable with an action; and wherein the action may occur subsequent to detecting the event.
14. A lighting device according to claim 13 wherein the action includes generating an alert.
15. A lighting device according to claim 14 wherein the light source is operated between an on position and an off position responsive to the alert.
16. A lighting device according to claim 4 wherein the image is compressible by the microcontroller; and wherein the image that is compressed is transmitted through the network.
17. A lighting device according to claim 1 wherein the sensor captures a plurality of images; wherein the plurality of images are concatenated to create a video feed; and wherein the video feed is accessible using the interface.
18. A lighting device according to claim 17 wherein the results of the analysis performed by the rules engine are included in the video feed.
19. A lighting device according to claim 1 wherein supplemental data is viewable using the interface.
20. A lighting device according to claim 5 wherein the known object is detectable by at least two of the plurality of sensors to create a stereoscopic perspective; and wherein a parallax among the images in the stereoscopic perspective is used to calculate depth in a three-dimensional space.
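Claim 20's depth calculation from parallax follows the standard stereo relation Z = fB/d, where f is focal length, B the baseline between the two sensors, and d the disparity (parallax) of the object between the two images. A hedged sketch (parameter names and units are assumptions; the claim only recites using parallax to calculate depth):

```python
def depth_from_parallax(focal_length_px, baseline_m, disparity_px):
    """Estimate the depth in metres of a known object seen by two sensors,
    using the standard stereo relation Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_length_px * baseline_m / disparity_px
```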
21. A lighting device according to claim 1 wherein the light source comprises an LED.
22. A lighting device according to claim 1 wherein the light source is operable responsive to the output.
23. A lighting device capable of detecting a condition in an environment comprising:
a camera to capture a video frame from a location in the environment;
a microcontroller, including a processor and memory, that is electrically connected to the camera by a network, wherein the video frame is transmitted to the microcontroller through the network as data;
a light source electrically connected to and controlled by the microcontroller;
a rules engine stored in the memory, wherein the processor compares the data to rules contained in the rules engine to produce an output, and wherein the output controls the light source; and
an interface through which the output is accessible;
wherein at least part of the rules engine defines an analysis of the data including the steps of:
capturing a subsequent video frame;
capturing a current video frame;
comparing the subsequent video frame and the current video frame to a known object, wherein the known object is stored in the memory; and
generating the output of the analysis with respect to whether the known object is detected at a specific location in the subsequent video frame; and

wherein at least part of the rules defining detecting the known object in the subsequent video frame include the steps of:
defining a subsequent background in the subsequent video frame and a current background in the current video frame by detecting an unchanging image;
detecting an anomaly from the subsequent background and the current background indicating that the known object appears in either the subsequent video frame or the current video frame;
generating the output that is storable in the memory;

wherein an event is definable in the rules to relate to the known object being detected in the environment;
wherein the event is associable with an action;
wherein the action may occur subsequent to detecting the event;
wherein the action includes generating an alert; and
wherein the light source is operable responsive to the alert.
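The event/action chain recited in claims 13-15 and 23 (known object detected → event → associated action → alert → light source operated) can be sketched as a small rules table. All class, event, and method names here are hypothetical illustrations, not terms from the claims:

```python
class LightingRules:
    """Hypothetical event-to-action table: detecting the known object
    raises an event, the event maps to an action, and the "generate
    alert" action operates the light source in response."""

    def __init__(self):
        self.light_on = False
        # Event definable in the rules, associated with an action.
        self.actions = {"object_detected": self._alert}

    def _alert(self):
        # Operate the light source responsive to the alert.
        self.light_on = not self.light_on
        return "alert"

    def handle(self, event):
        action = self.actions.get(event)
        return action() if action else None
```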
24. A lighting device according to claim 23 wherein the analysis of the data defined by at least part of the rules engine further includes the step of generating the output of the analysis with respect to whether the known object is detected at a plurality of locations in the subsequent video frame.
25. A lighting device according to claim 23 wherein the network is electrically connected to a centralized computing device; and wherein the centralized computing device analyzes the data transmitted by the camera.
26. A lighting device according to claim 23 wherein the camera comprises a plurality of cameras that are positioned at locations throughout the environment in a uniform manner; wherein the plurality of cameras capture a plurality of video frames in an approximately uniform manner at the locations; and wherein the plurality of video frames are concatenated to create an overview that includes substantially all of the locations throughout the environment substantially seamlessly.
27. A lighting device according to claim 26 wherein the locations throughout the environment are configured relating to an approximately grid based pattern.
28. A lighting device according to claim 26 wherein the plurality of cameras are positioned to capture the video frames using similar viewing angles relative to the environment.
29. A lighting device according to claim 23 wherein the camera includes a plurality of cameras that intercommunicate through the network using mesh networking.
30. A lighting device according to claim 23 wherein the video frame is compressible by the microcontroller, and wherein the video frame that is compressed is transmitted through the network.
31. A lighting device according to claim 23 wherein the video frame comprises a plurality of video frames that are concatenated to create a video feed; and wherein the video feed is accessible using the interface.
32. A lighting device according to claim 31 wherein the results of the analysis performed by the rules engine are included in the video feed.
33. A lighting device according to claim 23 wherein supplemental data is viewable using the interface.
34. A lighting device according to claim 24 wherein the camera includes a plurality of cameras; wherein the known object is detectable by at least two of the plurality of cameras to create a stereoscopic perspective; and wherein a parallax among the video frames in the stereoscopic perspective is used to calculate depth in a three-dimensional space.
35. A method of detecting a condition in an environment comprising:
capturing a video frame from a location in the environment;
transmitting the video frame to a microcontroller as data, wherein the microcontroller comprises a processor and memory;
conducting an analysis of the data by comparing the data to rules contained on a rules engine stored in the memory to produce an output; and
making the output accessible through an interface;
wherein conducting the analysis includes the steps of:
capturing a subsequent video frame;
capturing a current video frame;
comparing the subsequent video frame and the current video frame to a known object, wherein the known object is stored in the memory; and
generating the output of the analysis with respect to a determination of whether the known object is detected in the subsequent video frame;

wherein at least part of the rules defining detecting the known object in the subsequent video frame include the steps of:
defining a subsequent background in the subsequent video frame and a current background in the current video frame by detecting an unchanging image;
detecting an anomaly between the subsequent background and current background indicating that the known object appears in either the subsequent video frame or the current video frame;
generating the output that is storable in the memory responsive to detection of the anomaly;
defining an event in the rules to relate to the known object being detected in the environment;
associating the event with an action; and
executing the action upon detecting an occurrence of the event.
36. A method according to claim 35 wherein conducting the analysis of the data further includes the step of generating the output of the analysis with respect to whether the known object is detected at a specific location in the subsequent video frame.
37. A method according to claim 35 wherein conducting the analysis further includes the step of generating the output of the analysis with respect to whether the known object is detected at a plurality of locations in the subsequent video frame.
38. A method according to claim 35 wherein a plurality of sensors that are electrically connected to a network are used to capture the video frame; wherein the network is electrically connected to a centralized computing device; and further comprising analyzing the data using the centralized computing device.
39. A method according to claim 38 further comprising positioning each of the plurality of sensors at locations throughout the environment in a uniform manner; capturing video frames in an approximately uniform manner at the locations; and concatenating the video frames to create an overview that includes substantially all of the locations throughout the environment substantially seamlessly.
40. A method according to claim 39 wherein the locations throughout the environment are configured relating to an approximately grid based pattern.
41. A method according to claim 38 wherein the plurality of sensors each comprise a camera.
42. A method according to claim 35 wherein executing the action further comprises generating an alert.
43. A method according to claim 42 further comprising operating a light source responsive to the alert being generated.
44. A method according to claim 38 further comprising compressing the video frame using the microcontroller; and transmitting the video frame that is compressed through the network.
45. A method according to claim 35 wherein the video frame includes a plurality of video frames; and further comprising concatenating the plurality of video frames to create a video feed; and wherein the video feed is accessible using the interface.
46. A method according to claim 35 wherein the known object is detectable by at least two sensors to create a stereoscopic perspective; and wherein a parallax among the video frames in the stereoscopic perspective is used to calculate depth in a three-dimensional space.

The claims below are in addition to those above.
All references to claims which appear below refer to the numbering after this sentence.

What is claimed is:

1. A digital imaging apparatus comprising:
a main storage unit for storing a firmware for at least two data transfer modes for transmitting image data acquired through a camera to an external device in different manners, respectively, and also for storing a descriptor including information for identifying the firmware;
a mode selection unit for outputting a mode selection signal for a certain transfer mode for the at least two data transfer modes;
a transmitting module for connecting to the external device to transmit the image data to the external device, the transmitting module sending the image data in different data transfer manners for each data transfer mode; and
a controlling unit which controls the transmitting module so as to set the transfer manner corresponding to the selected certain transfer mode if the mode selection signal is received from the mode selection unit, and reads out the descriptor of the firmware corresponding to the selected transfer mode from the main storage unit and provides the descriptor to the transmitting module if the transmitting module is connected to the external device,
wherein the controlling unit allows the read descriptor of the firmware to be transmitted if a transfer allowance command is received from the external device.
2. The apparatus of claim 1, further comprising a sub-storage unit for storing the image data picked up through the camera,
wherein the at least two data transfer modes include a first mode wherein the image data being currently acquired through the camera is transmitted to the external device in a real-time data stream, and a second mode wherein the image data stored in the sub-storage unit is transmitted to the external device, the main storage unit storing each of the firmware corresponding to the first mode and the second mode and the identifying information.
3. The apparatus of claim 2, further comprising
an on screen display (OSD) processing unit for OSD-processing a data transfer mode selection screen for selecting the certain transfer mode of the first mode and the second mode, and outputting the result of the processing;
a display unit for displaying the OSD processed data transfer mode selection screen; and
a display request unit for generating a display request signal on the data transfer mode selection screen,
wherein if the display request signal is received from the display request unit, the controlling unit OSD-processes and displays the data transfer mode selection screen on the display unit, and receives the mode selection signal for the transfer mode selected by the mode selection unit at the displayed data transfer mode selection screen.
4. The apparatus of claim 3, wherein the transmitting module comprises:
a first-in first-out (FIFO) which is divided into a plurality of temporary storage areas, and the image data for sending to the external device and the descriptor corresponding to the selected transfer mode are temporarily stored in at least one of the temporary storage areas;
a provision unit for providing a plurality of endpoints corresponding to the plurality of temporary storage areas; and
a communication controller that selects at least three endpoints out of the plurality of endpoints and sets the transfer manner for transferring the image data for each of the selected endpoints, and that allows the image data temporarily stored in the temporary storage areas corresponding to the selected endpoints to be sent to the external device according to the set transfer type.
5. The apparatus of claim 4, wherein if the first mode is selected by the mode selection unit, the communication controller selects endpoint numbers 0 to 2 among the plurality of endpoints, and if the second mode is selected by the mode selection unit, the communication controller selects endpoint numbers 0, 2, and 3 among the plurality of endpoints, and sets the transfer manner for each of the selected endpoints.
6. The apparatus of claim 1, wherein the transmitting module is a universal serial bus interface, and the identifying information of the firmware is stored at offset 10 of the descriptor as two data bytes.
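Claim 6 locates the firmware-identifying information at offset 10 of the descriptor, in a two-byte field. Reading that field might be sketched as below; the function name is illustrative, and little-endian byte order is an assumption (multi-byte fields in USB descriptors are conventionally little-endian):

```python
def firmware_id(descriptor: bytes) -> int:
    """Read the two-byte firmware-identifying field stored at offset 10
    of a descriptor.  Little-endian byte order is assumed."""
    if len(descriptor) < 12:
        raise ValueError("descriptor too short to contain the field")
    return int.from_bytes(descriptor[10:12], "little")
```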
7. The apparatus of claim 2, wherein if the external device and the transmitting module are connected with each other, the sub storage unit is recognized as an accessible movable disc in the external device.
8. A method for selecting data transfer mode of a digital imaging apparatus comprising the steps of:
storing image data being acquired through a camera;
selecting a certain mode of at least two modes for transferring the image data acquired through the camera to an external device in different manners;
connecting to the external device via a transmitting module so as to communicate with the external device after the certain transfer mode is selected;
receiving a transfer allowance command on a descriptor from the external device; and
transferring the descriptor including identifying information of firmware corresponding to the selected transfer mode to the external device.
9. The method of claim 8, wherein each of the firmware for the at least two transfer modes and each of the descriptors including the identifying information of the firmware are stored, respectively, the identifying information being stored at offset 10 of the descriptor as two data bytes.
10. The method of claim 8, prior to the step of selecting the certain transfer mode, further comprising the steps of:
generating a display request command for a data transfer mode selection screen from which the certain transfer mode of the at least two modes is selectable; and
if the display request command is generated, on screen display (OSD)-processing the data transfer mode selection screen, thereby displaying the OSD-processed data transfer mode selection screen,
wherein the step of selecting the certain transfer mode selects the certain transfer mode from the OSD-processed data transfer mode selection screen.
11. The method of claim 8, wherein the at least two data transfer modes include a first mode wherein the image data being currently acquired through the camera is transferred to the external device in a real-time data stream, and a second mode wherein the image data stored in the storage step is transferred to the external device, the firmware and the descriptor corresponding to the first mode and the second mode being stored, respectively.
12. The method of claim 11, after the step of selecting the certain transfer mode, further comprising the steps of:
selecting at least three endpoints for transferring image data corresponding to the selected transfer mode;
setting a transfer type for transferring the image data for each of the selected three endpoints;
after connecting to the external device, transferring the descriptor corresponding to the selected transfer mode to the external device;
receiving a run command on the selected transfer mode from the external device;
dividing the image data into a prescribed packet size and temporarily storing the divided image data in three temporary storage areas corresponding to the selected three endpoints; and
transferring the temporarily stored image data to the external device according to the set transfer type.
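Claim 12 recites dividing the image data into packets of a prescribed size and temporarily storing them in three storage areas corresponding to the selected endpoints. A minimal sketch of that division; the round-robin distribution policy and the endpoint numbers are assumptions (the claim only requires three areas):

```python
def packetize(image_data: bytes, packet_size: int, endpoints=(1, 2, 3)):
    """Divide image data into fixed-size packets and distribute them
    round-robin across temporary storage areas keyed by endpoint number."""
    areas = {ep: [] for ep in endpoints}
    for i in range(0, len(image_data), packet_size):
        packet = image_data[i:i + packet_size]
        # Assign consecutive packets to consecutive endpoint areas in turn.
        areas[endpoints[(i // packet_size) % len(endpoints)]].append(packet)
    return areas
```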
13. The method of claim 12, wherein, in the endpoint selection step, if the first mode is selected in the mode selection step, endpoint numbers 0 to 2 are selected; and if the second mode is selected, endpoint numbers 0, 2, and 3 are selected.
14. The method of claim 8 wherein the transmitting module is a universal serial bus interface.
15. A digital imaging apparatus for transferring image data being acquired through a camera to external device connected via a transmitting module, comprising:
a sub-storage unit for storing the image data being acquired through the camera;
a main storage unit for storing a firmware for a first mode wherein the image data being currently acquired through the camera is sent to the external device in a real-time stream and a firmware for a second mode wherein the image data stored in the sub-storage unit is sent to the external device;
a mode selection unit for applying a mode selection signal for a certain transfer mode of the first mode and the second mode; and
a controlling unit that allows identifying information of firmware corresponding to the selected certain mode to be transferred to the external device, if the mode selection signal is received from the mode selection unit and the transmitting module is connected to the external device.
16. The apparatus of claim 15, further comprising:
an on screen display (OSD) processing unit for OSD-processing a data transfer mode selection screen for selecting the certain transfer mode of the first mode and the second mode, and outputting the result of the processing;
a display unit for displaying the OSD-processed data transfer mode selection screen; and
a display request unit for generating a display request signal on the data transfer mode selection screen,
wherein if the display request signal is received from the display request unit, the controlling unit OSD-processes and displays the data transfer mode selection screen on the display unit, and the mode selection unit outputs the mode selection signal to the controlling unit by selecting the certain mode of the data transfer mode selection screen displayed on the display unit.
17. The apparatus of claim 15, wherein the transmitting module is a universal serial bus interface.
18. The apparatus of claim 15, wherein if the identifying information of the firmware is transferred to the external device and the execution command for the selected certain mode is received from the external device, the controlling unit allows the firmware corresponding to the selected certain mode to be executed.
19. The apparatus of claim 15, wherein if the external device and the transmitting module are connected with each other, the sub-storage unit is recognized as an accessible movable disc in the external device.
20. A method for selecting data transfer mode of a digital imaging apparatus transferring image data being acquired through a camera to an external device connected via a transmitting module comprising the steps of:
storing the image data being acquired through the camera;
selecting a certain mode among a first mode wherein the image data being currently picked up through the camera is sent to the external device in a real-time data stream and a second mode wherein the image data that is stored in advance is sent to the external device; and
if the certain mode is selected and the external device and the transmitting module are connected with each other to perform data communication, transferring the identifying information of firmware corresponding to the selected certain mode to the external device.
21. The method of claim 20, prior to the step of selecting the certain mode, further comprising the steps of:
generating a display request command for a data transfer mode selection screen from which the certain mode of the first mode and the second mode is selectable; and
if the display request command is generated, on screen display (OSD)-processing the data transfer mode selection screen, thereby displaying the OSD-processed data transfer mode selection screen,
wherein the step of selecting the certain mode selects the certain mode from the OSD-processed data transfer mode selection screen.
22. The method of claim 20, wherein the transmitting module is a universal serial bus interface.
23. The method of claim 20, further comprising the step of:
after the step of transferring the identifying information of the firmware to the external device, receiving the execution command for the selected certain mode from the external device; and
executing firmware corresponding to the selected certain mode.