FGC UGC White Paper


    Making Media Move

    Introduction

Can the use of the Internet redefine the notion of in-field broadcasting? Can anyone with a cell phone or wireless card become a broadcaster? If the many people who can potentially cover a newsworthy event were able to coordinate their activities, what would that mean for event coverage? As everyone who has ever watched compelling news footage can attest, pristine image quality no longer takes precedence over content in determining whether material will air. Today, more than ever, the race is on to get timely content ingested, reviewed, and distributed as quickly as possible.

    From Broadcast News to Breaking News

There is an iconic moment in the 1987 film Broadcast News in which actress Joan Cusack must deliver the videotape master of a just-edited news story. What ensues is one of the funniest (and truest!) moments, recognizable to anyone who has been in the broadcasting business. Cusack, undaunted, dodges everything from crowded halls to stationary water coolers as she rushes to get the tape into master control for airing. Finally, bruised and limping, she delivers the tape, makes the deadline, and then confidently strides away. Broadcast news indeed!

Breaking news, as we have come to think of it, has typically been characterized by an interruption of regular programming in order to cover a live, unfolding event. Camera trucks are sent and live microwave shots are established. Naturally, there is some time delay between dispatching the trucks and the video feed becoming available. Of course, all that is now changing.

    Capturing and Sending Images from the Field

Today, the time element for putting images on the air is being dramatically affected by all forms of technology. The combination of portable cameras and still-image- and video-capable cell phones has created the possibility of a virtual extension of the broadcasting staff. The use of the open, public Internet along with these portable capture devices is rapidly augmenting and, ultimately, will redefine in-field broadcasting. Today, anyone with a cell phone or a wireless card in a laptop can become a broadcaster.

Field-Generated versus User-Generated Content

Two terms are often used somewhat interchangeably, but for the purposes of this paper it is helpful to refer to field-generated content (FGC) as content created by the professional staff members of an organization. User-generated content (UGC) will be used to describe content created by non-staff members, typically citizen journalists or


members of the public who have happened upon newsworthy occurrences. While the distinction may seem unnecessary, it is important to distinguish between the two groups. Further, from a system design perspective, it is important to provide certain features and functions specific to each group. These distinctions will be outlined within this paper.

    Timeliness vs. Quality

Regardless of image quality, frame rate, composition, or how steady or shaky the footage is, the nature of the content and how compelling it is will win out over any technical consideration. Gone, forever, are the days of being sticklers for quality. It is not that quality has been forsaken or that it is not valued, but the competition for viewers has never been fiercer. Further, the types, sizes, and locations of the screens on which people view content are undergoing rapid change. Each of these distribution and consumption channels must be serviced.

And those distribution channels must be served fast. With so many people potentially able to capture breaking news, broadcasters must develop strategies to handle this in-field-generated content. Whether it is a hurricane in Louisiana, an earthquake in Pakistan, or a citizen journalist who has captured footage of an accident, viewers have demonstrated that they will go where the content is and will watch it whether it is provided to them on television, on a computer screen, or on a mobile handset.

    Getting Footage in From the Field

If you are a news director, journalist, reporter, or videographer, you share a common desire to bring the most relevant content to your audience in the timeliest fashion. You are scheduled in the morning, after it is decided which major stories are going to be covered. Naturally, covering predetermined events is an important aspect of any broadcast organization. But when the event is not scheduled, when, indeed, it is something that is rapidly unfolding, the staff gets dispatched to cover it. What are some of the most relevant issues that arise in trying to ingest, review, and distribute in-field-generated content, and how are they affected when it is not your staff but a pool of citizen journalists equipped with camcorders and cell phones?

    Content Submission, Security, and Acceleration

How will field-generated content be submitted? The most sensible approach is to provide a method for electronic submission. This can be done by creating a website that can be specifically branded for each organization in order to maintain station identity. This is especially important whether we are talking about a 16-station group or a network comprising 220 affiliates.


Next, it is likely that you will want to prompt the submitter for some metadata, such as a name and contact telephone number, and to check the required legal release. Users are then prompted to choose the file and upload it.

Figure 1 is an example of a submission page along with prompts for metadata and file transfer acceleration.

Figure 1. A web page that is branded for a specific organization, prompts the user for metadata, and provides for file transfer security and file transfer acceleration.

    Local Stations in Local Touch

Even if we limit our scope to the U.S., we find that there are still 90 station groups, ranging from one station (GH Broadcasting, Corpus Christi, Texas) to 60 stations (Ion Media Networks, West Palm Beach, Florida). Many of these station groups have already begun to create methods by which their journalists and videographers, as well as their viewers, can upload content. While this process began with still photos sent via email by viewers, the practice now encompasses


getting video content from the field in a fast and efficient process. Local television stations are clearly in pursuit of their greatest asset: being local. Today, it is not enough to be a major network affiliate; it is critical to be locally relevant to the viewing community served.

Faced with the challenges of remaining local, how can a news organization receive especially timely content without having a news van on site? Whether the sender is a member of the public or a staff member, one of the critical enablers of using field-generated content is mobilizing it over the public Internet. This is also one of the challenges that must be addressed. For example, today, reporters are shooting in the field, cutting together stories on laptops, and going to Internet cafés to upload content. In this scenario, there are two important issues: security concerns and TCP/IP-based transfers.

    Security Takes On An Even Greater Role

In a file-based, workflow-centric world, security takes on a greater role than ever before. First, a secure transmission is highly desirable and should be established between the page in the web browser and the web server. This can easily be accomplished by using Transport Layer Security (TLS) or its predecessor, Secure Sockets Layer (SSL). These techniques provide for data security and data integrity when transferring data over TCP/IP networks. Regardless of whether the data is a credit card number or data that represents a video file, protecting the data in flight by encrypting the connection (the wire) is easily accomplished.
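As a sketch of the client side of such a secured upload, Python's standard library shows the relevant TLS defaults directly; the snippet only constructs the context and inspects it, and no server is contacted:

```python
import ssl

# Build a client-side TLS context, as an upload tool would before
# contacting the submission web server.
context = ssl.create_default_context()

# With a default context, certificate verification and hostname checking
# are both enabled, protecting the media bytes in flight.
print(context.verify_mode == ssl.CERT_REQUIRED)
print(context.check_hostname)
```

Both checks print True, which is why a default context is a sensible starting point for any uploader.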

Also, if it is necessary to provide additional data protection or to prove that the data has not been tampered with, two methods can be employed. First, the payload, in this case the data, can be encrypted using any number of encryption methodologies. One of the most common is AES at 128-, 192-, or 256-bit key strength. Further, it may be required that the integrity of the data be protected; that is, the actual bits must be verified as exactly the same from sender to receiver. One way of accomplishing this is to utilize a Secure Hash Algorithm (SHA), which computes a fixed-size bit string from arbitrary blocks of data. In this way, blocks of data that are sent can be compared on the receiving end. If the hash value has changed, the integrity of the file has been compromised, whether through defects or tampering.
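A minimal sketch of this verification, using Python's standard hashlib (SHA-1 here, matching the technique named in Figure 2; the payload is illustrative):

```python
import hashlib

def file_digest(data: bytes, block_size: int = 65536) -> str:
    """Hash a payload block by block, as a receiver would while reading
    an incoming file from the network."""
    digest = hashlib.sha1()
    for i in range(0, len(data), block_size):
        digest.update(data[i:i + block_size])
    return digest.hexdigest()

payload = b"field-generated video payload " * 1000
sender_hash = file_digest(payload)

# The receiver recomputes the digest; any change to the bits changes it.
assert file_digest(payload) == sender_hash
assert file_digest(payload[:-1] + b"X") != sender_hash
```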

This level of security can appear, perhaps, overdesigned for the topic being discussed, but imagine a news director receiving a video interview with a witness to a tragic international incident, or a lawyer receiving video testimony from a witness who cannot travel to a courtroom: it will often be necessary to prove that the literal bits of the file have not been tampered with. As the Media and Entertainment (M&E) industry increasingly adopts and transforms to a completely file-based workflow, a secure and verifiable electronic affidavit and certified delivery equivalents are necessary.


Figure 2 shows the various security components and methods that can be employed in the transfer of digital files between sender and receiver.

Figure 2. Authentication, SHA-1 hashing, and media encryption techniques used to establish data confidentiality and integrity.

The submission of content using the public Internet also brings with it both file transfer and security issues. FTP (File Transfer Protocol) over TCP/IP, with its positive acknowledgment requirement, places a limit on how quickly content can be sent. It is also important to consider that the public Internet is an open, unsecured network that everyone shares. This means that content is in the clear and that there is some level of communal sharing of the total available network bandwidth.

There are, of course, several methods to accelerate the movement of content. One such implementation is a software-based WAN (wide area network) accelerator. By clicking a button in the web page that engages this WAN acceleration technology, content can be uploaded approximately 60% faster than with traditional methods such as FTP. WAN acceleration technology is also especially helpful whenever considerable amounts of latency are introduced at the network level.

    Figure 3 shows the effects of latency on file transmission times.


Figure 3. As network latency increases, file send times using FTP increase dramatically.

Building both security and WAN acceleration technologies into the web-based upload process also makes it possible to notify both submitter and receiver that the content was uploaded and received. The field reporter then knows definitively that the content arrived and that the network connection can be closed, and can move on to the next story and location.

    Bandwidth, Payload, and Latency

In order to create a reliable system to receive FGC and UGC, it is necessary to understand the practical requirements with respect to the bandwidth available to the person attempting to upload content, the payload (the amount of data), and the latency of the IP network. These are important factors that must be taken into account for any submission system to actually operate.

For example, it is a misconception that any IP network and transmission methodology will suffice to send content from the field into a facility. While it may appear that any network and protocol will work, there are simple, practical issues that must be understood. Often these are not taken into consideration.

Let's explore some of the most common methods. It is, of course, quite common to simply attach a file to an email message. These transfers are typically accomplished using HTTP


(Hypertext Transfer Protocol). While not limited to TCP/IP, HTTP is the method by which most such web-based email transfers occur. This transfer mechanism uses a request-response method wherein the client (e.g., the person uploading the content) sends data to the server. Because of this request/response process, as the latency of the network increases, the ability to send data becomes directly limited by the time it takes for sent data to be acknowledged.

File Transfer Protocol (FTP) is a very typical method of transferring files between systems. It, too, must operate according to TCP/IP constructs and is therefore susceptible to the same delay in the request/response process. As latency increases, the amount of data that can be sent over time decreases.

Further, sent data is at risk unless some form of protection is in place. A common approach to protecting sent data is checkpointing. Without it, if a file is being sent via FTP and a loss of network connectivity occurs, the entire file must be resent. Checkpointing enables systems to stay in synchronization, typically using time intervals or data-transferred intervals. The maximum amount of data lost is therefore equal to the set time interval or set data amount between the systems. These settings are configurable; for example, a common checkpointing interval is 32 KB (kilobytes).

    How Much Network is Really Required?

    One of the most important factors to understand in architecting and implementing a solution

    for transferring FGC and UGC content is to understand all the variables that will potentially

    affect the user experience. Specifically, one must take into account the following:

Amount of data to be sent
Network link speed
Network latency
Transport protocol

The cumulative effects of these four items will indicate how long it will take to receive files from

the field. However, a common misconception is that any network connection will suffice; while that may be conceptually true, in practical implementations there are simply minimum speeds that must be in place for a viable FGC/UGC solution to provide acceptable and timely results.

    Let us examine the real effects of latency on a practical set of data. In Figure 4, we are

    attempting to send two hours of DV 25 (Digital Video at 25 megabits/sec [mbits/sec]) material.

    Network connectivity is a robust 500 mbits/sec. The resulting file size is 20.9 gigabytes (GB). An

    examination of the graphs and accompanying tables clearly indicates the beneficial aspects of

    WAN acceleration and latency compensation technology.


    Figure 4. With a robust network of 500 mbits/sec, WAN acceleration demonstrates clear

    benefits over TCP transport.

For example, note that as latency increases, there is minimal effect on completion times when using WAN acceleration. At zero latency, the 20.9 GB file will take six minutes to reach its destination. At 264 milliseconds (ms) of latency, typical for an intercontinental link, only an additional 34 seconds are required. Contrast this with using TCP as the transport protocol, shown in the lower table. With zero latency, delivery times via WAN acceleration or TCP are the same, at six minutes. However, note the increase in delivery time as latency grows between 40 and 72 ms (typical for a NY-to-LA link). At 264 ms, 28 hours are needed to send the file, versus six minutes and 34 seconds using WAN acceleration.
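These orders of magnitude can be approximated from first principles. The sketch below uses the classic TCP ceiling of one window of data per round trip and assumes a 64 KB TCP window (an assumption; the white paper does not state the window size), which is why the high-latency result lands in the same tens-of-hours range as the table rather than matching it exactly:

```python
def tcp_throughput_bps(link_bps, window_bytes, rtt_s):
    """Classic TCP ceiling: the lesser of the raw link rate and one
    window of data per round-trip time (window / RTT)."""
    if rtt_s == 0:
        return link_bps
    return min(link_bps, window_bytes * 8 / rtt_s)

FILE_BITS = 20.9e9 * 8     # the 20.9 GB file from Figure 4
LINK_BPS = 500e6           # 500 mbits/sec of connectivity
WINDOW = 64 * 1024         # assumed 64 KB TCP window (not stated in the paper)

fast = FILE_BITS / tcp_throughput_bps(LINK_BPS, WINDOW, 0.0)
slow = FILE_BITS / tcp_throughput_bps(LINK_BPS, WINDOW, 0.264)
print(round(fast / 60), "minutes at zero latency")   # 6 minutes
print(round(slow / 3600), "hours at 264 ms via TCP")  # tens of hours
```

At zero latency the link itself is the only limit, so both transports finish in about six minutes; at 264 ms the window/RTT ceiling, not the 500 mbit/sec link, dictates TCP's delivery time.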

However, let us now examine the results when network connectivity is reduced. In Figure 5, note that we have decreased the amount of DV 25 content to be sent to only five minutes, for a total of 894 MB. Further, note that the effect of network latency is no longer a key factor. The reason is that the network link is simply not large enough to


accommodate the increased number of data packets that could be sent using WAN acceleration techniques.

Figure 5. Because network connectivity is so low, transfer times for the reduced amount of data are the same with WAN acceleration and with standard TCP protocols.


    In the next example (Figure 6), the file size remains constant at 894 MB, but we have increased

    the network link to 1.5 mbits/sec. Again, note that there is no difference in the delivery times.

    Figure 6. The effects of latency are negligible with T1 connectivity.

    The Tale of the Hotel Room

Continuing with Figure 6, note that, while theoretically the delivery times are equal between WAN acceleration and TCP, what occurs in the real world is often very different. The following example is indicative of what you are very likely to experience when using software-based WAN acceleration. Enticingly, this example is titled "the tale of the hotel room" and is based on a true story.


A person has just finished capturing an event and goes into a hotel lobby in Oslo, Norway. He tries to upload to a server in the United Kingdom using both FTP and Signiant's Media Exchange application, which includes software-based WAN acceleration technology. The hotel Wi-Fi connection supports a 1 (one) megabit/second upload and download speed. There is over 150 ms of latency from Oslo to London over this network link.

Using FTP, the file uploads at just 60 kilobits/second, despite the 1 megabit/second capability of the network. The file is then re-sent, this time using Media Exchange with WAN acceleration, and uploads at 900 kilobits/second, or 15 times faster. This true tale clearly shows that slower networks (e.g., 1 mbit/sec) can still be put to good use when latency is a factor. In this case, theoretical speeds did not matter: taking advantage of even limited bandwidth (here, 1 mbit/second) with WAN acceleration is, indeed, possible and achievable.

Now, as we increase the capacity of the network link, we begin to see the positive effects of WAN acceleration as latency reaches 72 ms and higher (Figure 7).


    Figure 7. As both network capacity and latency increase, we begin to see the increase in delivery

    times using TCP transport at the 72 ms level.

    As the capacity of the network increases to 100 mbits/sec, notable improvements occur at even

    8 ms latency (Figure 8).


    Figure 8. With network connectivity at 100 mbits/sec, WAN acceleration delivers consistent

    results while we begin to see increasing delivery times using TCP transport at the 8 ms level.

    Finally, as the network capacity increases to 155.5 mbits/sec (OC3), we can clearly see the

    benefits of WAN acceleration (Figure 9).


Figure 9. Here we see the classic flat-lining of a consistent delivery time using WAN acceleration and the rapid reduction in sustained data transfer using TCP transport.

Getting the File versus Not Getting It at All

The preceding examples indicate that there is a level at which the ability to upload content from the field is not affected by WAN acceleration techniques. As network capacity decreases, and even as latency increases, the ability to utilize a network link to its complete capacity reaches a limit. Under these circumstances, it is critical that some form of checkpointing be implemented. Otherwise, a loss of network connectivity, however momentary (as is often the case with data traffic on the Internet), will result in the need to resend the entire


file. Obviously, this is counterproductive when the content is expected to be in high demand.

Regulating Upload and Download Speeds

Further impacting the ability to upload (or download) content from the field is the extent to which the sender will have a sustained amount of bandwidth available during the course of the transfer. If an organization has a dedicated link to the Internet that provides a guaranteed number of mbits/sec, then that link can be apportioned according to the FGC/UGC needs.

However, consider what can occur when content is being sent, by either a professional staff member or a citizen journalist, over the open and insecure public Internet via an Internet Service Provider (ISP). The connectivity could take many forms, naturally: perhaps mobile, cable modem, satellite, and so forth. However, when implementing a solution for FGC and UGC, it is important to consider that there have, indeed, been instances where an ISP may:

Provide the agreed-upon bandwidth for the length of the connection.
Provide less than the agreed-upon bandwidth for the length of the connection.
Step down, that is, limit or regulate the transfer speed over a period of time.

The third possibility, stepping down the transfer speed, can have particularly damaging effects on the goal of getting content in from the field. If this occurs, robust auto-restart of an interrupted transfer, checkpointing, and resumption of a transfer from the point of interruption become critical capabilities to have implemented in the solution. It is therefore important to be cognizant of these network provider issues.
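A sketch of such an auto-restart loop (the `send_once` callback and its error convention are hypothetical; a real sender would report its last checkpointed offset when the link drops):

```python
def robust_upload(send_once, max_attempts=5):
    """Auto-restart an interrupted transfer from its last checkpoint.
    `send_once(offset)` sends from `offset` and either returns the total
    bytes delivered or raises ConnectionError carrying the resume offset."""
    offset = 0
    for _ in range(max_attempts):
        try:
            return send_once(offset)        # transfer completed
        except ConnectionError as err:
            offset = err.args[0]            # resume point, not byte zero
    raise RuntimeError("transfer failed after retries")

# Simulated flaky link: drops twice (checkpointing 32 KB each time),
# then completes on the third attempt.
attempts = []
def flaky(offset, _state={"drops": 2}):
    attempts.append(offset)
    if _state["drops"]:
        _state["drops"] -= 1
        raise ConnectionError(offset + 32 * 1024)
    return 1_000_000

assert robust_upload(flaky) == 1_000_000
assert attempts == [0, 32 * 1024, 64 * 1024]   # resumed, never restarted
```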

    Protocol Failover

Depending upon how content is being sent to your organization, it may be necessary to offer the ability to automatically handle different protocol requirements. For example, let's say that a person uploading FGC is doing so using a corporate-provided network link. This means that the person uploading the content has been granted the ability to use UDP (User Datagram Protocol) WAN acceleration. However, there are, indeed, cases where FGC and UGC cannot utilize WAN acceleration.

This often occurs as a result of a corporate policy wherein UDP-based protocols are not allowed on the corporate network. It may also occur due to the presence of a firewall or web proxy server between the uploader and the server processing the incoming data. In these cases, it may not be possible to send (or, for that matter, receive) content in an accelerated fashion. Instead, it may only be possible to send content via TCP or HTTP protocols.


In designing a system for processing FGC and UGC, one should provide the ability to automatically fail over among transport protocols. In this manner, a transfer can be configured to begin as a UDP transfer and then automatically fail over to TCP or to HTTP. Again, the key point is that the system should provide this failover without human intervention. Shown in Figure 10 is a setup profile that indicates what automatic action should be taken as FGC and UGC transfers are processed. In this example, transfers are configured to begin as UDP-based and then automatically fail over to TCP and then to HTTP protocols.

    Figure 10. An example of a protocol setup screen which indicates automatic failover from UDP

    to TCP to HTTP transport methods.
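The failover chain can be sketched as an ordered list of transports tried in turn (the sender functions below are stand-ins for illustration, not Signiant APIs):

```python
def transfer_with_failover(payload, transports):
    """Try each transport in priority order (e.g. UDP -> TCP -> HTTP),
    falling back automatically with no human intervention."""
    for name, send in transports:
        try:
            return name, send(payload)
        except OSError:
            continue        # e.g. UDP blocked by a firewall or web proxy
    raise RuntimeError("all transports failed")

def udp_send(data):
    raise OSError("UDP blocked by corporate firewall")

def tcp_send(data):
    return len(data)        # stand-in for a real TCP upload

used, sent = transfer_with_failover(
    b"clip", [("udp", udp_send), ("tcp", tcp_send)])
assert used == "tcp" and sent == 4
```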

    Reviewing the Footage and Deciding What Happens Next

Now imagine the situation at the news desk when content is electronically submitted either by citizen journalists or by staff members. By building a web-based submission and review system, the news director can be electronically notified that content has arrived and needs to be examined. This can be an email that arrives at the desktop or on a mobile device. Embedding a link to the media in the email notification allows the reviewer to examine the footage by invoking the proper media player.
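A sketch of such a notification, built with Python's standard email package (the addresses and URL are placeholders, not from the white paper):

```python
from email.message import EmailMessage

def review_notification(clip_url, submitter):
    """Build the news-desk notification; embedding the media URL lets
    the reviewer open the footage in the proper player."""
    msg = EmailMessage()
    msg["Subject"] = "New field content from %s - review needed" % submitter
    msg["From"] = "ingest@station.example"
    msg["To"] = "newsdesk@station.example"
    msg.set_content("New footage is ready for review:\n%s\n" % clip_url)
    return msg

msg = review_notification("https://media.station.example/clip/123", "J. Doe")
assert "clip/123" in msg.get_content()
```

A real system would hand the message to its SMTP gateway or mobile push service.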

Next, decisions must be made. Is the content relevant? If yes, to whom? What needs to happen to the content? Can it be sent as raw video? Or, before it airs, does it need to be edited and packaged in some way? Should the packaged piece of content then go to the six US New England affiliates, to the 13 mid-Atlantic US affiliates, or to every affiliate? How is this file movement coordinated?

In Figure 1, note that the content reviewer page includes a menu item showing how content can be routed to very specific locations. Behind the scenes, a workflow automation routine has been designed to carry out these content movement orders.


    Viruses and Content Checking

As anyone responsible for data integrity and corporate IT security will attest, it is important to interrogate any data being brought into the facility as a result of FGC and UGC endeavors. This may be as straightforward as temporarily quarantining the incoming content long enough for it to be checked for viruses before being brought onto the main network-attached storage (NAS) in the facility.

In addition, content type (format) and content naming conventions (what the file was labeled by the content uploader versus what the facility naming convention requires) are two common areas that often need to be addressed before the content can be utilized.
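A sketch of this quarantine-then-rename step (the virus scan is stubbed and the slug-based naming convention is illustrative, not from the paper):

```python
from pathlib import Path
import re
import tempfile

def virus_scan_passed(path):
    """Stand-in for a real anti-virus engine (always passes here)."""
    return True

def ingest(upload, quarantine, nas, story_slug):
    """Quarantine an upload, check it, then move it onto the NAS under
    the facility naming convention."""
    held = quarantine / upload.name
    upload.rename(held)                     # hold outside the main NAS
    if not virus_scan_passed(held):
        raise RuntimeError("upload failed virus check; left in quarantine")
    clean = re.sub(r"[^a-z0-9]+", "_", story_slug.lower()).strip("_")
    final = nas / (clean + held.suffix)     # rename to house convention
    held.rename(final)
    return final

base = Path(tempfile.mkdtemp())
for d in ("quarantine", "nas"):
    (base / d).mkdir()
src = base / "IMG 0042!!.mov"
src.write_bytes(b"video bytes")

out = ingest(src, base / "quarantine", base / "nas", "Main St Fire")
assert out.name == "main_st_fire.mov"
```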

    The Importance of Workflow Automation

Getting content in from the field is only one component of an entire workflow that must be accomplished. In fact, once content has been moved into either the newsroom or the production/post-production facility, what ensue are multiple uses of the content that must be facilitated in the most efficient manner possible.

All of the questions and decisions that arise the moment the news director has reviewed a piece of field-generated content and determined that it is newsworthy can benefit from workflow automation functionality. If the content needs to be routed to specific sets of affiliates, can it be done automatically? Can the content be routed to a drop box or directly into, say, an affiliate's playback server? What is the distribution system to facilitate this? Is it FTP? Do we need to accelerate these file transfers? Are we employing less expensive network connections, using the open, public Internet rather than expensive satellite time?

Further, we mustn't forget that multiple distribution channels are critical: the content will be needed by the development group in charge of placing it on the web page, and it must also service the mobile partners. This content preparation process, intelligent platform-specific publishing, if you will, can all be automated. Resizing images, encoding, and distribution can all be part of a workflow automation set of routines.

Figure 11 is an example of how a series of scripts can be linked together to form an automated workflow. In this example, at a specific timing interval, a SOAP (Simple Object Access Protocol) message is generated that initiates a file transfer, which deposits the file(s) into a specific folder destination and then delivers an email notification of the event.
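The linked-script model can be sketched as a list of steps sharing one context (the step functions below are stand-ins for the SOAP, transfer, deposit, and notification scripts):

```python
def run_workflow(steps, context):
    """Minimal workflow runner: each linked script takes and returns the
    shared context, so steps can pass results down the chain."""
    for step in steps:
        context = step(context)
    return context

log = []
def fire_soap(ctx):      log.append("soap");     return ctx
def transfer_file(ctx):  log.append("transfer"); return ctx
def deposit_file(ctx):   log.append("deposit");  return ctx
def notify(ctx):         log.append("notify");   return ctx

run_workflow([fire_soap, transfer_file, deposit_file, notify], {})
assert log == ["soap", "transfer", "deposit", "notify"]
```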


Figure 11. The creation of a workflow: scripts that perform specific tasks are linked and sequenced together to create an automated process.

    Modeling Routine and Complex Workflows

The ingest of content generated in the field and distributed to any number of venues and screen types (terrestrial, satellite, mobile, etc.) benefits greatly from tasks that can be sequenced and then automated. These workflows range from relatively simple tasks, such as the one outlined in Figure 11, to routine tasks that fall into easily identifiable categories.

Shown in Figure 12 are examples of categories that routinely need to be accomplished when media transactions occur. Transport is self-explanatory. Aggregate refers to the need to aggregate content from one or from many sources (as would be the case with FGC or UGC). The reverse of Aggregate, of course, would be an automatic Distribute mechanism. Embargo is an interesting capability: it refers to the distribution of content to one or more locations with an imposed embargo applied until specific conditions are met (e.g., date, time), after which the content becomes available to the recipient. The Transport S3 workflow that is shown refers to the movement of content to Amazon S3 (Simple Storage Service) cloud storage.


Finally, the Apple FCP Transfer to iTunes workflow shows how content can be automatically delivered from an Apple Final Cut editing application directly to the Apple iTunes servers for content ingest.
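The Embargo category above reduces to a simple predicate evaluated at the recipient (the lift time below is illustrative):

```python
from datetime import datetime, timezone

def is_releasable(embargo_until, now=None):
    """Embargo check: content may be pre-positioned at the recipient but
    is withheld until the condition (here, a date/time) is met."""
    now = now or datetime.now(timezone.utc)
    return now >= embargo_until

lift = datetime(2019, 7, 31, 12, 0, tzinfo=timezone.utc)
before = datetime(2019, 7, 31, 11, 59, tzinfo=timezone.utc)
assert not is_releasable(lift, before)   # still embargoed
assert is_releasable(lift, lift)         # condition met: release
```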

By using a sophisticated workflow modeling engine (WME), content-oriented tasks can be sequenced and invoked to automatically and programmatically move content according to preset rules and policies that dictate when content should be moved, what bandwidth utilization should be applied, and so forth.

    Figure 12. Various categories of workflows that represent routine tasks for an enterprise and

    which observe the rules and policies by which content flows to, from, and within an

    organization.

    From Aggregation to Delivery

Workflow modeling systems must be flexible enough to accommodate and service the ever-changing business models of today's media and entertainment industry. Further, relatively simple categories of workflows can be combined to create more complex workflows.


For example, if we take the Aggregate workflow shown in Figure 12 and combine it with the multi-step, multi-process workflow shown in Figure 13, we accomplish an end-to-end Aggregation-to-Distribution result. In this case, our Aggregate template would be used to process ingest of FGC or UGC.

As part of this aggregation cycle, the content is then delivered to a folder for ingest into an editing system. Once the editing is complete, a Signiant transport call is made and the newly edited content is delivered to a multi-stage series of processes. These processes are contained in the foundational group, which specifies two transcode operations and a transfer to a geographically separated editing system. The resulting files are then delivered via Signiant agent technology and FTP, and resolved to the target editing system.

Finally, the workflow calls for automatic email notifications upon successful task completion, notifications if exceptions are encountered, and database updating.

Figure 13. A more complex workflow created using Signiant's Workflow Modeling Engine. Shown are linked workflows with multiple processes, applications, and If, Else, and Exception actions. Editorial, transcoding, content database, and multiple transport delivery protocols are all involved in this automated workflow.

    Evaluating Needs and Implementation

A digital ingest, review, and distribution system to support field-generated content should take into consideration three important areas: 1) a secure, WAN-accelerated, web-based upload and download capability; 2) an automated workflow framework to allow for flexibility of file-based workflows; and 3) a centrally managed distribution system that provides routing, scheduling, network bandwidth utilization, auditing, and reporting functionality. How much of your company's bandwidth to use for any given transfer or set of transfers is, ultimately, a question of business priorities. Having proper controls to determine this is important.


Figure 14 is an example of a dashboard view that provides real-time feedback on digital media transfers and bandwidth control functionality.

Figure 14. A dashboard view that shows digital media transfers, locations of site transfers, and network bandwidth management tools.

    Conclusion

In today's competitive broadcast and media environment, it is now a reality that anyone with a camera, a phone, and an Internet connection can be a broadcaster. But there is a wide range of activities that may occur from the moment an event is captured to the time and places at which it is viewed. Getting breaking news on the air means getting it on the air over many device and screen types, and demands the implementation of a coordinated file ingest, review, processing, packaging, and distribution methodology. The technologies to accomplish these tasks and to implement them have never been timelier.

Signiant Content Distribution Management (CDM) software addresses these needs.

Tom Ohanian is Chief Strategy Officer at Signiant. He is an Academy Award and two-time Emmy Award recipient.