How Can I Process Untrusted Data Sources Securely? [closed]





Here is the conundrum: at my current company, we process physical discs from numerous third-party sources and extract their data for ingestion into our own system. While we can generally trust the sources, we don't want to risk introducing any form of malware into our internal networks and systems. Is there any way we can safely process these discs without too much (or any!) additional effort?



Our current process is:



  • Employee receives a disc and inserts it into their workstation.

  • Employee extracts the data from the disc.

  • Employee uploads the extracted data to our internal system.

Obviously, in the current format, if a compromised disc is inserted into an employee's workstation, the entire network could potentially be infected within minutes. Not ideal.



One proposed solution was to use an air-gapped machine to inspect each disc before processing, but this raises its own problem: how can we reliably detect any (or new) malware on that machine? It also adds an extra, time-intensive step, since each disc would have to be extracted twice.



Another solution is a machine on an isolated subnet of our network, with AV installed and WAN access restricted to AV updates only. Discs could be inserted into that machine and extracted remotely from an employee's workstation, with the data then ingested (somehow; perhaps via a proxy?) into the system.



What would be the most secure, most cost-effective, and least time-consuming method of performing this operation? If there is a recommended industry standard, what is it and where can I read up on it?



EDIT:



The discs are DICOM-compatible, so they contain multiple images (.tiff or .dcm) but also (usually) a viewer application (a .exe) to view those images. The worry here is more that one of these files could contain a Trojan, I guess. I'm still quite junior in cybersecurity, so forgive me if I'm misunderstanding some aspects!

























closed as too broad by Steffen Ullrich, Ghedipunk, forest, MechMK1, Rory Alsop Jul 21 at 21:10


Please edit the question to limit it to a specific problem with enough detail to identify an adequate answer. Avoid asking multiple distinct questions at once. See the How to Ask page for help clarifying this question. If this question can be reworded to fit the rules in the help center, please edit the question.













  • 1





    There is no generic best way. There are some generic solutions with different levels of cost, usability and security (usually more security means less usability and/or more cost), and which of these is best depends on your specific (and unknown) security, usability and cost requirements, and on how you are able to deal with the remaining risks (unknown too). And there might be better ways specific to the kind of data you get from outside and how it needs to be processed in-house. Only, these details are not provided in your question.

    – Steffen Ullrich
    Jul 16 at 13:42












  • Maybe have a look at TENS. This will let you look at the disc on a disconnected device, like a laptop. If deemed safe, you can connect the device and transfer the files. Though this is just one among several things you can do.

    – Artog
    Jul 16 at 14:17







  • 6





    What kind of disks are you talking about? CD/DVD/Blu-ray discs? Hard disks? Or something else?

    – Bergi
    Jul 16 at 21:55











  • With regard to "generally" trusting the sources of the data: this is probably not a good idea. The important point is that while it's possible that the people sending you the discs are trustworthy, what you actually need to trust is the data on the disc. This means that if the people you are trusting had their security compromised, you can still get malicious data even if the sender is an angel with your best interests at heart.

    – DreamConspiracy
    Jul 17 at 9:34











  • For detecting malware, you could use a host-based intrusion detection system (HIDS) such as OSSEC. It detects changes to files, the registry, etc.

    – Fax
    Jul 18 at 8:38


















Tags: network, virus, worm














edited Jul 16 at 15:16







Stuart H

















asked Jul 16 at 13:29









Stuart H

196 · 1 silver badge · 6 bronze badges


























3 Answers


















24














It all depends on what you actually do with the data. A bunch of bits sitting on a disk is just that: a bunch of bits. It needs to be somehow executed in order to become malware and pose a threat to your network.



This can happen in several ways:



  • Windows allows AutoRun on removable media. Mitigation: switch the ingestion machine to Linux, or carefully configure Windows.

  • Manual execution: employees can do it by accident. Mitigation: restrict employees' accounts; for example, make the files read-only by default. This protects only against accidental actions, not malicious ones.

  • Data on the disk can exploit a vulnerability in file system drivers. This is pretty unlikely: file system drivers are well tested, and any vulnerability here would be critical and a very expensive 0-day (so it wouldn't be used against you).
    Mitigation: install security updates automatically. Another way to protect against this is to use user-space file system drivers. AFAIK they are less well tested, less stable, and less performant, but any successful exploitation yields user-level (not kernel-level) access. You'd have to weigh this trade-off yourself.

  • Lastly, the data can exploit vulnerabilities in other software you use to process the files. This is the most likely option: arbitrary code execution vulnerabilities are regularly discovered in software like PDF readers. Mitigation: keep your software updated and configured for high security (don't allow macros in MS Office, etc.).
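As one concrete instance of the AutoRun mitigation in the first bullet, Windows honors a machine-wide registry policy. This is a minimal config sketch (deployable via GPO in a domain); the 0xFF value sets every drive-type bit in NoDriveTypeAutoRun, disabling AutoRun for all drive types:

```reg
; Disable AutoRun/AutoPlay for all drive types, machine-wide.
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\Explorer]
"NoDriveTypeAutoRun"=dword:000000ff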

Additionally, you can contain every file-processing step in a VM and reset it to a known good state after each workday. This would be pretty inconvenient but would offer some additional protection, especially if you disable networking on the VM.
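For the VM-reset idea above, most hypervisors expose a snapshot-restore command. A hedged sketch for VirtualBox follows; the VM name "disc-ingest" and snapshot name "clean-baseline" are hypothetical placeholders, not names from this answer:

```python
# Sketch: reset an ingestion VM to a known-good snapshot with VBoxManage.
# "disc-ingest" and "clean-baseline" are illustrative names; adjust to taste.
import subprocess

VM_NAME = "disc-ingest"
SNAPSHOT = "clean-baseline"

def reset_commands(vm: str = VM_NAME, snapshot: str = SNAPSHOT) -> list[list[str]]:
    """Return the VBoxManage invocations, in order: power off the VM,
    restore its clean snapshot, and start it again headless."""
    return [
        ["VBoxManage", "controlvm", vm, "poweroff"],
        ["VBoxManage", "snapshot", vm, "restore", snapshot],
        ["VBoxManage", "startvm", vm, "--type", "headless"],
    ]

def reset_vm() -> None:
    """Run the reset sequence (requires VirtualBox to be installed)."""
    for cmd in reset_commands():
        subprocess.run(cmd, check=True)
```

Scheduling `reset_vm()` at the end of each workday (cron, Task Scheduler) automates the "reset to known good state" step.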



I wouldn't get my hopes up about antiviruses: unless you're actively executing the supplied files, they wouldn't offer much protection.


























  • 1





    My first thought was a locked-down VM environment also (which could be automated: initial expense to set up, but it could be streamlined well)

    – jleach
    Jul 16 at 15:06






  • 1





    The VM idea occurred to me after I initially posted the question. An isolated VM without network access could be used to check the disk and, if deemed all clear, could then be connected to the host machine and the contents copied across that way. Only downside to this, as far as I can see, would be the staff training involved and the potential extra processing time.

    – Stuart H
    Jul 16 at 15:21






  • 3





    @StuartH the whole point is that you can’t “check” something and deem it “all clear”. It all depends on how you, your software and your OS would handle the data in question. Not executing any program from the disk gets you 95% towards secure, and the other 5% wouldn’t be covered by checking files with antiviruses.

    – Andrew Morozko
    Jul 16 at 15:36











  • AutoRun can be entirely disabled by a Windows domain GPO. In a properly managed environment this is not a concern.

    – Nathan Goings
    Jul 16 at 23:13






  • 2





    @StuartH There are ways in which a program can check whether it is being run inside a VM. So just because something runs fine in a VM without any sign of malign activity does not guarantee that it will do the same when run directly on real hardware. You should really try to keep employees' workstations on a limited subnetwork.

    – Giacomo Alzetta
    Jul 17 at 7:45


















21














I have experience securing DICOM in an identical situation so I'll focus on that.



Assuming you're using a properly configured environment (AutoRun disabled, frequently updated antimalware, etc.), CDs are relatively safe. The same cannot be said for USB drives. We used a burner PC for USB drives.



These discs are usually created by a PACS (picture archiving and communication system). Most PACS, when burning discs for external viewing, use proprietary software on top of the standard DICOM format. I've seen a handful that use proprietary software and a proprietary format, which require you to launch the software and "save as" DICOM; in those cases, we used the burner PC.



Otherwise, all standard DICOM formats have a DICOMDIR file containing the file-system and DICOM metadata required to extract all the "images" associated with it. We developed an extraction program that would run on disc mounting, read any DICOMDIR files, and extract only the DICOMDIR and images to a staging location. A records tech would then review the images with a third-party viewing tool, confirm they were the records expected, and send them to a peer for processing. This prevented any "operating system" interactions with the data, severely reducing the attack surface.
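As an illustration of that extraction step (a minimal sketch, not the author's actual program): copy the DICOMDIR plus any file carrying the DICOM Part 10 magic (a 128-byte preamble followed by "DICM"), leaving bundled viewer executables behind. Note that plain .tiff files lack this magic and would need their own allow-list rule.

```python
# Copy only DICOM content from a mounted disc into a staging directory.
# Everything else (viewer .exe, autorun.inf, ...) is deliberately skipped.
import shutil
from pathlib import Path

def is_dicom(path: Path) -> bool:
    """True if the file has the Part 10 header: 128 preamble bytes, then b'DICM'."""
    try:
        with path.open("rb") as f:
            return f.read(132)[128:132] == b"DICM"
    except OSError:
        return False

def extract_disc(disc_root: Path, staging: Path) -> list[Path]:
    """Copy DICOMDIR and DICOM files into staging, preserving layout;
    return the list of copied destination paths."""
    staging.mkdir(parents=True, exist_ok=True)
    copied = []
    for src in disc_root.rglob("*"):
        if src.is_file() and (src.name.upper() == "DICOMDIR" or is_dicom(src)):
            dst = staging / src.relative_to(disc_root)
            dst.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(src, dst)
            copied.append(dst)
    return copied
```

A reviewer can then inspect the staging directory with a trusted third-party viewer before forwarding, as described above.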



The burner PC was a spare computer with restricted network access. The tech would load the data, identify the source as safe, load the DICOM images into a program, and DICOM-transmit them to a peer; only the DICOM peer and AV updates were allowed on the restricted network.



Once the DICOM images were transmitted to the peer (using an AE_Title), they could send the images for printing, disc burning, or loading into a medical record on the main system.



The loading systems, the peer, and the main PACS were all segregated on the network and firewall. They could only talk to their respective whitelists, using their respective protocols (DICOM, HTTP, AV updates, etc.).



The final risk mitigation was regular backups and backup verification, along with a comprehensive disaster recovery plan.



Summary



We limited the network devices that dealt with DICOM images to communicating only with a whitelist, and removed as many "operating system" interactions as we could, leaving only highly specific, but required, protocols as risk factors. This covered a large attack surface, leaving only our proprietary DICOM systems at direct risk, which was mitigated with proper network segregation, backups, and disaster recovery.



As for the burner PC, we would replace it or re-image it whenever it failed. Our official policy was to physically replace any machine with detected malware on it.



I had a great PACS Administrator who put up with our process and in return I supplied him with plenty of CD burner drive rubber-bands.




Edit: I'd like to point out that the majority of DICOM images we received were from known entities. That is, patients would provide images from sister hospitals or imaging facilities that our diagnostics department had worked with (and that had, in some cases, previously employed our staff). My biggest concern was generic worms (USB AutoRun, etc.) rather than proprietary-software (PACS/DICOM) exploits. A proprietary exploit would indicate a targeted attack, usually from a business we had existing contracts with, which is a highly unlikely event.


























  • 2





    I assume the burner PC was regularly wiped and re-imaged? (Hence the name.) Could you add those policies to your already fantastic answer?

    – Jörg W Mittag
    Jul 17 at 5:24











  • @JörgWMittag, Added. Additionally, I clarified the fact that we received the majority of our images from known sources. It was a pain when a new business would start sending us things; that usually meant new proprietary software to figure out.

    – Nathan Goings
    Jul 17 at 14:07


















2














Use a dedicated machine on a separate VLAN, with Internet access only for updating virus definitions. Then SFTP the data to a dedicated Linux VM that also runs an AV instance; if you can manage it, run several different AVs, for example ESET plus ClamAV. After the data is scanned again and validated clean, push it (or have it pulled) into your main system. Every system should run antivirus, properly configured and not excluding any of the items on the discs. And because security is layered, the rest of your infrastructure should be hardened, with GPOs in AD to help prevent people from doing things carelessly.
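A hedged sketch of the scan-and-decide step on such a Linux VM, built on ClamAV's documented clamscan exit codes (0 = no virus found, 1 = virus(es) found, 2 = error); the staging path and policy here are illustrative, not a vetted deployment:

```python
# Run ClamAV over a staging directory and map its exit status to an
# ingestion decision. Uses only documented clamscan flags (-r recurse,
# --infected report infected files only). Scanner errors fail closed.
import subprocess

def classify_scan(returncode: int) -> str:
    """Translate clamscan's exit code into a decision."""
    if returncode == 0:
        return "clean"       # safe to hand off to the main system
    if returncode == 1:
        return "infected"    # quarantine and alert; never ingest
    return "error"           # scanner failure: treat as not-clean

def scan_staging(staging_dir: str) -> str:
    """Scan a staging directory recursively with clamscan."""
    proc = subprocess.run(["clamscan", "-r", "--infected", staging_dir])
    return classify_scan(proc.returncode)
```

Only data classified as "clean" would be forwarded; "infected" and "error" both block ingestion, which keeps a broken scanner from silently waving files through.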



It's been my experience that when third-party business entities share information, they have contracts drawn up for damages in the event that one accidentally infects the other, so from a legal perspective make sure you have protection in both directions. Having these contracts in place gives you the flexibility to trust the source unconditionally, without needing all of the other precautions and security. Just remember: either it's secure and users can't use it, or users can use it and it's not secure; you can't have both, only a sliding scale between the two. I've seen some businesses push security to the point that only a few people could get anything done, while others wanted all users running with admin rights so they could do whatever they wanted, until something catastrophic happened and they changed their minds.





































    3 Answers
    3















    24














    It all depends on what you actually do with the data. A bunch of bits sitting on a disk is just that: a bunch of bits. It needs to be somehow executed in order to become malware and pose a threat to your network.



    This could be done in a couple of ways:



    • Windows allows autorun on removable media. Mitigation: use a Linux machine for ingestion, or carefully configure Windows.

    • Manual execution: employees can do it by accident. Mitigation: restrict employees' accounts—for example, make the files read-only by default. This protects against accidental actions, not malicious ones.

    • Data on the disk can exploit a vulnerability in file-system drivers. This is pretty unlikely: file-system drivers are well tested, and any vulnerability here would be critical and a very expensive 0-day (so it wouldn't be used against you).
      Mitigation: install security updates automatically. Another option is to use user-space file-system drivers. AFAIK they are less well tested, less stable, and less performant, but a successful exploit yields only user-level (not kernel-level) access. You'd have to weigh this trade-off yourself.

    • Lastly, the data can exploit vulnerabilities in other software you use to process the files. This is the most likely avenue: arbitrary-code-execution vulnerabilities are regularly discovered in software like PDF readers. Mitigation: keep your software updated and configured for high security (don't allow macros in MS Office, etc.).
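On the Linux-ingestion side, the "carefully configure" step can be made concrete by mounting the media with restrictive options. A minimal sketch that only builds the command line; the device and mount point shown in the usage note are hypothetical. The mount options themselves (`ro`, `noexec`, `nosuid`, `nodev`) are standard Linux options.

```python
def mount_command(device: str, mountpoint: str) -> list:
    # ro: never write back to the media; noexec/nosuid/nodev: nothing on the
    # disc can be executed, run setuid, or treated as a device node by accident.
    return ["mount", "-o", "ro,noexec,nosuid,nodev", device, mountpoint]
```

For example, `mount_command("/dev/sr0", "/mnt/ingest")` yields a command you could pass to `subprocess.run`.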

    Additionally, you can contain every file-processing step in a VM and reset it to a known-good state after each workday. This is somewhat inconvenient, but it offers additional protection, especially if you disable networking on the VM.
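The end-of-day reset can be automated if the VM runs under libvirt: snapshot the clean state once, then revert on a schedule. A sketch that only builds the `virsh` commands (both `virsh destroy` and `virsh snapshot-revert` are real libvirt subcommands); the VM and snapshot names are assumptions.

```python
def reset_commands(vm: str = "ingest-vm", snapshot: str = "clean-baseline"):
    # Force the VM off, then roll it back to the known-good snapshot.
    # Schedule this (e.g. via cron) at the end of each workday.
    return [
        ["virsh", "destroy", vm],
        ["virsh", "snapshot-revert", vm, snapshot],
    ]
```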



    I wouldn’t get my hopes up about antiviruses: unless you’re actively executing the supplied files, they won’t offer much protection.


























    • 1





      My first thought was a locked down VM environment also (which could be automated: initial expense to set up, but could be streamlined well)

      – jleach
      Jul 16 at 15:06






    • 1





      The VM idea occurred to me after I initially posted the question. An isolated VM without network access could be used to check the disk and, if deemed all clear, could then be connected to the host machine and the contents copied across that way. Only downside to this, as far as I can see, would be the staff training involved and the potential extra processing time.

      – Stuart H
      Jul 16 at 15:21






    • 3





      @StuartH the whole point is that you can’t “check” something and deem it “all clear”. It all depends on how you, your software and your OS would handle the data in question. Not executing any program from the disk gets you 95% towards secure, and the other 5% wouldn’t be covered by checking files with antiviruses.

      – Andrew Morozko
      Jul 16 at 15:36











    • Autorun can be entirely disabled by Windows Domain GPO. In a proper environment this is not a concern.

      – Nathan Goings
      Jul 16 at 23:13






    • 2





      @StuartH There are ways a program can detect that it is running inside a VM. So just because something runs fine in a VM without producing any sign of malign activity does not guarantee it will do the same on real hardware. You should really try to keep employees' workstations in a limited subnetwork.

      – Giacomo Alzetta
      Jul 17 at 7:45
















    edited Jul 16 at 14:40

    answered Jul 16 at 14:28

    Andrew Morozko
    1,629 · 3 silver badges · 8 bronze badges










    21














    I have experience securing DICOM in an identical situation, so I'll focus on that.



    Assuming you're using a properly configured environment (Autorun disabled, frequently updated antimalware, etc.), CDs are relatively safe. The same cannot be said for USB drives; we used a burner PC for those.



    These discs are usually created by a PACS (picture archiving and communication system). Most PACS, when burning discs for external viewing, put proprietary viewer software on top of the standard DICOM format. I've seen a handful that use proprietary software and a proprietary format, requiring you to launch the software and "save as" DICOM—in those cases, we used the burner PC.



    Otherwise, all standard DICOM discs carry a DICOMDIR file containing the file-system and DICOM metadata needed to extract all the "images" associated with them. We developed an extraction program that would run on disc mounting, read any DICOMDIR files, and extract only the DICOMDIR and images to a staging location. A records tech would then review the images with a third-party viewing tool, verify they were the records expected, and send them to a peer for processing. This prevented any "operating system" interactions with the data, severely reducing the attack surface.
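One defensive detail in an extractor like this: DICOMDIR records reference their images via the ReferencedFileID attribute, a list of path components relative to the disc root, and a hostile disc could try to point that outside the mount. A sketch of the path mapping only (the `/mnt/disc` mount path is hypothetical; a real extractor would read the DICOMDIR with a library such as pydicom and copy each resolved file to staging):

```python
import os

def media_path(referenced_file_id, mount_root="/mnt/disc"):
    """Map a DICOMDIR ReferencedFileID (path components relative to the disc
    root) to a host path, rejecting anything that escapes the mount."""
    parts = ([referenced_file_id] if isinstance(referenced_file_id, str)
             else list(referenced_file_id))
    root = os.path.normpath(mount_root)
    candidate = os.path.normpath(os.path.join(root, *parts))
    if not candidate.startswith(root + os.sep):
        raise ValueError("ReferencedFileID escapes the media mount")
    return candidate
```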



    The burner PC was a spare computer that had restricted network access. The tech would load the data, identify the source as safe, load the DICOM images into a program, and DICOM transmit them to a peer—where only the DICOM peer and AV updates were allowed on the restricted network.



    Once the DICOM images were transmitted to the peer (using an AE_Title), they could send the images for printing, disc burning, or loading into a medical record on the main system.



    The loading systems, the peer, and the main PACS were all segregated on the network and firewall. They could only talk to their respective whitelists, using their respective protocols (DICOM, HTTP, AV updates, etc.).
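The whitelist logic amounts to default-deny over an explicit inventory of allowed (source, destination, protocol) triples—in practice enforced by the firewall, but the policy can be sketched as data. The host names below are hypothetical:

```python
# Hypothetical inventory: each entry is (source, destination, protocol).
ALLOWED = {
    ("burner-pc", "dicom-peer", "dicom"),
    ("dicom-peer", "main-pacs", "dicom"),
    ("burner-pc", "av-mirror", "http"),
}

def permitted(src, dst, proto):
    # Default-deny: a flow is allowed only if it is explicitly whitelisted.
    return (src, dst, proto.lower()) in ALLOWED
```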



    The final risk mitigation was regular backups and backup verification, along with a comprehensive disaster recovery plan.



    Summary



    We limited the network devices that dealt with DICOM images to communicating only with a whitelist, and removed as many "operating system" interactions as we could, leaving only highly specific—but required—protocols as risk factors. This covered a large attack surface, leaving only our proprietary DICOM systems at direct risk, which was mitigated with proper network segregation, backups, and disaster recovery.



    As for the burner PC, we would replace it or re-image it whenever it failed. Our official policy was to physically replace any machine with detected malware on it.



    I had a great PACS Administrator who put up with our process and in return I supplied him with plenty of CD burner drive rubber-bands.




    Edit: I'd like to point out that the majority of DICOM images we received were from known entities. That is, patients would provide images from sister hospitals or imaging facilities that our diagnostics department had worked with—some of which had even previously employed our staff. My biggest concern was generic worms (USB autorun, etc.), not proprietary-software (PACS/DICOM) exploits. A proprietary exploit would indicate a targeted attack, usually from a business we had existing contracts with—a highly unlikely event.


























    • 2





      I assume the burner PC was regularly wiped and re-imaged? (Hence the name.) Could you add those policies to your already fantastic answer?

      – Jörg W Mittag
      Jul 17 at 5:24











    • @JörgWMittag, Added. Additionally, I clarified that we received the majority of our images from known sources. It was a pain when a new business would start sending us things—that usually meant new proprietary software to figure out.

      – Nathan Goings
      Jul 17 at 14:07















    21














    I have experience securing DICOM in an identical situation so I'll focus on that.



    Assuming you're using properly configured environment (Autorun disabled, frequently updated Antimalware, etc.) then CDs are relatively safe. The same cannot be said for USB drives. We used a burner PC for USB drives.



    These discs are usually created by a PACS system (Picture archiving and communication system). Most PACS, when burning discs for external viewing use a proprietary software on top of the standard DICOM format, I've seen a handful that use a proprietary software and proprietary format that require you launching the software and "save as" DICOM—In those cases, we used the burner PC.



    Otherwise, All standard DICOM formats have a DICOMDIR file that has the required file-system and DICOM metadata to extract all the "images" associated with it. We developed an extraction program that would run on disc mounting, read any DICOMDIR files, and extract only the DICOMDIR and images to a staging location. A records tech would then review the images with a third party viewing tool and identify that it was the records they expected and then send them to a peer for processing. This prevented any "Operating System" interactions with the data, severely reducing the attack surface.



    The burner PC was a spare computer that had restricted network access. The tech would load the data, identify the source as safe, load the DICOM images into a program, and DICOM transmit them to a peer—where only the DICOM peer and AV updates were allowed on the restricted network.



    Once the DICOM images were transmitted to the peer (using an AE_Title), they could send the images for printing, disc burning, or loading into a medical record on the main system.



    The loading systems, the peer, and the main PACS were all segregated on the network and firewall. They could only talk to their respective whitelists, using their respective protocols (DICOM, Http, AV Updates, etc.).



    The final risk mitigation was regular backups and backup verification, along with a comprehensive disaster recovery plan.



    Summary



    We limited the network devices that delt with DICOM images to only communicate with a whitelist and removed as much "Operating System" interactions as we could, leaving only highly specific—but required—protocols as risk factors. This covered a large attack surface leaving only our proprietary DICOM systems at direct risk. This was mitigated with proper network segregation, backups, and disaster recovery.



    As for the burner PC, we would replace it or re-image it whenever it failed. Our official policy was to physically replace any machine with detected malware on it.



    I had a great PACS Administrator who put up with our process and in return I supplied him with plenty of CD burner drive rubber-bands.




    Edit: I'd like to point out, that the majority of DICOM images we received were from known entities. That is, patients would provide images from sister hospitals or imaging facilities that our diagnostics department had worked with—some even previously employed by. My biggest concern was generic worms (USB Autorun etc.) and not so much from proprietary software (PACS/DICOM) exploits. A proprietary exploit would indicate a targeted attack, usually from a business we had existing contracts with—a highly unlikely event.






    share|improve this answer




















    • 2





      I assume the burner PC was regularly wiped and re-imaged? (Hence the name.) Could you add those policies to your already fantastic answer?

      – Jörg W Mittag
      Jul 17 at 5:24











    • @JörgWMittag, Added. Additionally, I clarified the fact that we received the majority of our images from known sources. It was a pain when new business would start sending us things—that usually meant a new proprietary software to figure out.

      – Nathan Goings
      Jul 17 at 14:07













    21












    21








    21







    I have experience securing DICOM in an identical situation so I'll focus on that.



    Assuming you're using properly configured environment (Autorun disabled, frequently updated Antimalware, etc.) then CDs are relatively safe. The same cannot be said for USB drives. We used a burner PC for USB drives.



    These discs are usually created by a PACS system (Picture archiving and communication system). Most PACS, when burning discs for external viewing use a proprietary software on top of the standard DICOM format, I've seen a handful that use a proprietary software and proprietary format that require you launching the software and "save as" DICOM—In those cases, we used the burner PC.



    Otherwise, All standard DICOM formats have a DICOMDIR file that has the required file-system and DICOM metadata to extract all the "images" associated with it. We developed an extraction program that would run on disc mounting, read any DICOMDIR files, and extract only the DICOMDIR and images to a staging location. A records tech would then review the images with a third party viewing tool and identify that it was the records they expected and then send them to a peer for processing. This prevented any "Operating System" interactions with the data, severely reducing the attack surface.



    The burner PC was a spare computer that had restricted network access. The tech would load the data, identify the source as safe, load the DICOM images into a program, and DICOM transmit them to a peer—where only the DICOM peer and AV updates were allowed on the restricted network.



    Once the DICOM images were transmitted to the peer (using an AE_Title), they could send the images for printing, disc burning, or loading into a medical record on the main system.



    The loading systems, the peer, and the main PACS were all segregated on the network and firewall. They could only talk to their respective whitelists, using their respective protocols (DICOM, Http, AV Updates, etc.).



    The final risk mitigation was regular backups and backup verification, along with a comprehensive disaster recovery plan.



    Summary



    We limited the network devices that delt with DICOM images to only communicate with a whitelist and removed as much "Operating System" interactions as we could, leaving only highly specific—but required—protocols as risk factors. This covered a large attack surface leaving only our proprietary DICOM systems at direct risk. This was mitigated with proper network segregation, backups, and disaster recovery.



    As for the burner PC, we would replace it or re-image it whenever it failed. Our official policy was to physically replace any machine with detected malware on it.



    I had a great PACS Administrator who put up with our process and in return I supplied him with plenty of CD burner drive rubber-bands.




    Edit: I'd like to point out, that the majority of DICOM images we received were from known entities. That is, patients would provide images from sister hospitals or imaging facilities that our diagnostics department had worked with—some even previously employed by. My biggest concern was generic worms (USB Autorun etc.) and not so much from proprietary software (PACS/DICOM) exploits. A proprietary exploit would indicate a targeted attack, usually from a business we had existing contracts with—a highly unlikely event.






    share|improve this answer















    I have experience securing DICOM in an identical situation so I'll focus on that.



    Assuming you're using properly configured environment (Autorun disabled, frequently updated Antimalware, etc.) then CDs are relatively safe. The same cannot be said for USB drives. We used a burner PC for USB drives.



    These discs are usually created by a PACS system (Picture archiving and communication system). Most PACS, when burning discs for external viewing use a proprietary software on top of the standard DICOM format, I've seen a handful that use a proprietary software and proprietary format that require you launching the software and "save as" DICOM—In those cases, we used the burner PC.



    Otherwise, All standard DICOM formats have a DICOMDIR file that has the required file-system and DICOM metadata to extract all the "images" associated with it. We developed an extraction program that would run on disc mounting, read any DICOMDIR files, and extract only the DICOMDIR and images to a staging location. A records tech would then review the images with a third party viewing tool and identify that it was the records they expected and then send them to a peer for processing. This prevented any "Operating System" interactions with the data, severely reducing the attack surface.



    The burner PC was a spare computer that had restricted network access. The tech would load the data, identify the source as safe, load the DICOM images into a program, and DICOM transmit them to a peer—where only the DICOM peer and AV updates were allowed on the restricted network.



    Once the DICOM images were transmitted to the peer (using an AE_Title), they could send the images for printing, disc burning, or loading into a medical record on the main system.



    The loading systems, the peer, and the main PACS were all segregated on the network and firewall. They could only talk to their respective whitelists, using their respective protocols (DICOM, Http, AV Updates, etc.).



    The final risk mitigation was regular backups and backup verification, along with a comprehensive disaster recovery plan.



    Summary



    We limited the network devices that delt with DICOM images to only communicate with a whitelist and removed as much "Operating System" interactions as we could, leaving only highly specific—but required—protocols as risk factors. This covered a large attack surface leaving only our proprietary DICOM systems at direct risk. This was mitigated with proper network segregation, backups, and disaster recovery.



    As for the burner PC, we would replace it or re-image it whenever it failed. Our official policy was to physically replace any machine with detected malware on it.



    I had a great PACS Administrator who put up with our process and in return I supplied him with plenty of CD burner drive rubber-bands.




    Edit: I'd like to point out that the majority of DICOM images we received were from known entities. That is, patients would provide images from sister hospitals or imaging facilities that our diagnostics department had worked with (some had even previously employed our staff). My biggest concern was generic worms (USB autorun, etc.), not proprietary software (PACS/DICOM) exploits. A proprietary exploit would indicate a targeted attack, usually from a business we had existing contracts with; a highly unlikely event.















    edited Jul 17 at 14:05

























    answered Jul 16 at 23:41









    Nathan Goings

    763 · 4 silver badges · 11 bronze badges











    • 2





      I assume the burner PC was regularly wiped and re-imaged? (Hence the name.) Could you add those policies to your already fantastic answer?

      – Jörg W Mittag
      Jul 17 at 5:24











    • @JörgWMittag, Added. Additionally, I clarified the fact that we received the majority of our images from known sources. It was a pain when new business would start sending us things—that usually meant a new proprietary software to figure out.

      – Nathan Goings
      Jul 17 at 14:07






























    2














    Use a dedicated machine on a separate VLAN, with Internet access only to update virus definitions. Then SFTP the data to a dedicated Linux VM that also runs an AV instance; if you can manage it, run several different AV engines (ESET plus ClamAV, for example). After the data is scanned again and validated clean, push it (or have it pulled) into your main system. Every system should run antivirus, be configured properly, and not exclude any of the items on the discs. Because security is layered, the rest of your infrastructure should also be hardened, with GPOs in AD to help prevent people from doing things carelessly.
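    The multi-engine scan gate on the Linux VM could be sketched as below. The ClamAV `clamscan` flags and exit codes are real (0 = clean, 1 = virus found); the second engine entry and the target path are placeholders for whatever commercial scanner you license.

    ```python
    import shutil
    import subprocess

    # Base command lines, one per engine. Only the ClamAV entry is concrete;
    # a second commercial engine would be added per its vendor's CLI docs.
    SCANNERS = {
        "clamav": ["clamscan", "--no-summary", "--infected", "--recursive"],
        # "eset": ["<vendor-cli>", "..."],  # hypothetical placeholder
    }

    def scan_command(engine: str, target: str) -> list:
        """Build the argv for one engine against a target directory."""
        return SCANNERS[engine] + [target]

    def is_clean(target: str) -> bool:
        """Run every installed engine over the target; fail closed if any engine
        reports a problem. clamscan exits 0 when nothing is found."""
        for engine in SCANNERS:
            argv = scan_command(engine, target)
            if shutil.which(argv[0]) is None:
                continue  # engine not installed on this host
            if subprocess.run(argv).returncode != 0:
                return False
        return True
    ```

    Only data for which `is_clean` returns true would be pushed (or pulled) into the main system.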



    It's been my experience that when third-party business entities share information, they have contracts drawn up covering damages in the event one accidentally infects the other; from a legal perspective, make sure you have protection going in both directions. Having these contracts in place gives you the flexibility to trust the source unconditionally, without needing all of the other precautions. Just remember: either it's secure and users can't use it, or users can use it but it's NOT secure. You can't have both; it's a sliding scale between the two. I've had some businesses push security to the point that only a few people could do anything, while others wanted all users running with admin rights so they could do whatever they wanted, until something catastrophic happened and they changed their minds.







        answered Jul 16 at 18:32









    Brad

    774 · 3 silver badges · 7 bronze badges
















