Req Feedback from Fellow ePO/VirusScan Admins

VictorySabre

Technical User
Feb 14, 2002
225
CA
To Fellow ePO/VirusScan Admins:

I've been the ePO Admin for our company's head office for about 4 years now, managing about 1200 PCs. We've been getting a lot of complaints from our end users about our various agents/scanning activities and how they are impacting their ability to work. Complaints are steadily growing, and they are making our IT department look bad in the end users' eyes.

I'm interested in getting some feedback from fellow ePO/VirusScan administrators who manage 1000+ PCs to determine:

* Are other companies experiencing the same issues, and is our situation really no different from anyone else's?
* What have you done in your environment / infrastructure / policies to correct the problem?
* Are we too lax / just right / too restrictive with our VirusScan configuration given the way we are set up?
* Any recommendations that I can take back to management to help steer us in the right direction?

Here are some details about what I have to deal with. Our company develops software for other companies, the military, government, etc. The IT department supports our core set of standardized software on all our PCs, but individual projects may require/install specialty applications that are requested/required by the customer. A large number of our software developers (70% of all employees in this office) have separate local admin accounts on their systems to allow them to install/upgrade/remove software on their assigned PC. All users have access to the Internet.

VirusScan Config:
For our desktop PCs:
* We have Read/Write On-Access Scanning enabled
* On-Delivery E-mail Scan Enabled.
* On-Demand Scanning enabled to run once a week, on Sundays at 1:00 AM. PCs that were powered off are set to run missed tasks. On-Demand Scans are limited to run no more than 8 hours and are set to 30% CPU utilization.

For our laptop PCs:
* We have Write On-Access Scanning enabled only. Read scanning was disabled recently because of complaints.
* On-Delivery E-mail Scanning was disabled over a year ago because of complaints that the scanning was affecting Outlook operation while on the road.
* On-Demand Scanning enabled to run once a week, on Wednesdays at 2:00 PM. PCs that were powered off are set to run missed tasks. On-Demand Scans are limited to run no more than 8 hours and are set to 30% CPU utilization.

Our software developers are constantly building/compiling software code. That data is stored on the hard drives of their PCs, and as hard drives get cheaper and larger, users are storing more and more data locally.

In an ideal situation, only work-related programs would be run and only work-related web surfing performed by the end users, but I know that is not the case. Some people go to web sites they shouldn't, which brings down adware/malware/trojans; some people install software that is not work-related; and so on. Because of this, I've configured our settings to be fairly aggressive to try and keep our systems as well protected as possible.
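To put some rough numbers behind the complaints, here is a back-of-envelope sketch of how long a throttled weekly scan can take. The scan throughput and the data volume below are assumptions picked purely for illustration, not measured VirusScan figures, so treat the result as an order-of-magnitude check only:

# Rough estimate of on-demand scan duration under a CPU throttle.
# The 5 MB/s baseline throughput and the 150 GB data volume are assumptions,
# not measured McAfee VSE numbers.
GB = 1024 ** 3

def estimated_scan_hours(data_bytes, scan_rate_mb_per_s=5.0, cpu_throttle=0.30):
    """Very rough model: effective throughput scales with the CPU cap."""
    effective_rate = scan_rate_mb_per_s * cpu_throttle          # MB/s actually achieved
    seconds = (data_bytes / (1024 ** 2)) / effective_rate
    return seconds / 3600

# Example: a developer PC holding 150 GB of local source trees and build output.
print(f"~{estimated_scan_hours(150 * GB):.0f} h")               # ~28 h at these assumptions

With numbers anything like these, a data-heavy developer PC cannot finish inside an 8-hour window, so the scan simply runs until the time limit cuts it off.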

Again, I'm interested in getting feedback from other companies to see how they are managing their systems and whether or not you are encountering the same problems we are. If you have an environment similar to ours, what have you done to improve the end users' experience?

Many thanks in advance!
 
I think people moan no matter what you use, as PCs are only quick on the first day you get them.
I run McAfee on about 4000 desktops, about 15 MS servers, and about 40 NetWare servers. I run Sophos on about 4000 desktops as well.

McAfee beats Sophos in every single aspect except the price the bean-counters care about.

I don't scan on read, just on write,
but I do an on-demand scan daily, at around lunchtime.
 
Hello both,

we use ePO on 30000 PCs currently. I tend to agree with Terry712 that users will always complain, whatever you use...
As far as I can tell we have not had many complaints about the anti-virus as such; it's more about old hardware (we replace around 8000 PCs per year of the 40000 in total, so we still have some PCs from 2002 left).

configuration:
EPO 3.6
VSE 8.0
Engine 51000
MASE 8.0

We did one on-demand scan in the beginning and now only run one from time to time; nothing is scheduled.

No e-mail scans
(we have Sybari, now Microsoft, on the Exchange servers). One of the problems with e-mail scanning was the extra load on the mail servers when, for instance, some idiot sends an e-mail to, let's say, 10000 users. Since every client scans the e-mail on reception, the load did not go unnoticed on the mail servers! It also apparently breaks "single instance" mail storage, but I don't know the details. (Normally, when a mail is sent to a lot of people, only one copy is kept in the mail store; it could be related to our e-mail archiving product.)

No scanning on network drives, as all servers (around 800) have their own anti-virus agents (VSE)

We have extensively used the "high and low risk processes" feature: for high risk, we scan about everything (all extensions), on read and write; for low risk, we scan a lot less (the default extensions that come with the DAT files) and only on write. We have significantly increased the processes in both lists (e.g. adding our management software processes to low risk, etc...).

Script scan is used, but it had to be removed on some sites (for web apps using a lot of JavaScript). Unfortunately, you can not exclude URLs :-( ScriptScan is simply on or off.

MASE is part of the on-access scan; nothing is scheduled unless we suspect real threats (here too we did one scan in the beginning and run one from time to time).
MASE tends to make a lot of the reports useless unless you use filters in the reports (cookies will be your top ten of all time :) ).


That's what I can remember for now, as I was only part of the design team... The Operations team has been running it for some years now and I am not involved anymore, but as I said, no real complaints, and I'm sure I would have heard about them...

CU
G.

 
Just verified some settings, so here is some extra info...

We don't do an on-demand scan, as it is too slow for users, but we do do a "memory scan" (this lists all processes in memory and then verifies, on disk, the files that started them... so it is not actually scanning "in memory" but on disk; there is a rough sketch of the idea below).

we block unwanted programs (we have fine-tuned the lists)
we do access protection
we do scan the registry for traces of malware (with MASE); it takes only a minute, even on old PCs
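For anyone wondering what that "memory scan" amounts to in practice, here is a conceptual sketch in Python. It only illustrates the idea described above (enumerate running processes, then scan their on-disk images); scan_file() is a placeholder rather than a real McAfee call, and psutil is a third-party module.

import psutil   # third-party: pip install psutil

def scan_file(path):
    # Placeholder: here you would hand the file to your actual scanner.
    print(f"would scan {path}")

def scan_process_images():
    """Scan the on-disk image of every running process, once each."""
    seen = set()
    for proc in psutil.process_iter(attrs=["exe"]):
        exe = proc.info.get("exe")       # None for system/idle processes or access denied
        if exe and exe not in seen:
            seen.add(exe)
            scan_file(exe)

scan_process_images()

Since only the handful of executables that are actually running get checked, this kind of pass stays quick even on older hardware.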

This is still incomplete, but maybe it's easier if you ask about certain features...

CU
G.
 
Thank you for the feedback, Terry712 and gdvissch! We used to get some moaning from our end users and we've always accepted it as part of the protection, but lately we've been getting louder moaning, even from some high-profile employees in the company (Exec VPs, Project Managers, and even my IT dept manager!). This is one of the reasons why I've been asked to ask around to see if what I've implemented is sound, or if I am being way too paranoid with protection and should back off.

I do have some questions for the both of you that I was hoping you could answer, please.

Terry712:
1. With your On-Demand Scans that run at lunchtime, would you know how long it takes to complete a scan? Our On-Demand Scans are taking anywhere from 4 to 8+ hours, depending on how many files are stored on the PC. Do you do a full scan of all files or just selected files (the default)?

2. How much data is usually stored on your PCs, or do you require your users to store their data on a SAN/NAS?

gdvissch:
1. I've seen suggestions elsewhere to use High and Low Risk Process scanning as well, but did you keep McAfee's default list of High Risk programs, or did you add additional programs to the High Risk list too?

2. How has protection worked for you with High and Low Risk Scanning? (Do you find it is able to detect and remove viruses while users surf the web, or is it hit and miss?)

Both Terry712 and gdvissch:
Just wondering, is your company a software development company? If not, what line of business are you in? When you purchase PCs to replace older ones, what kind of specs are you looking at? (E.g. the fastest CPU available, ultra-fast hard drives, 1 GB of RAM, etc.) And at what age do you consider your PCs too slow and obsolete to be used and in need of retirement and replacement?

Many thanks again for your responses; the info you have provided has been very helpful!
 
The scan takes about 30 mins on most PCs: all files rather than just the default selection.
No data is held on PCs, except by the odd clueless muppet.

PCs are on a three-year cycle. Nearly all run XP, but a few NT ones remain. Most are Dells; the 240s are the oldest. Average RAM is probably 512 MB.
Afraid I'm not really on the PC side of things.

I'm using 8.5, not 8, though, and it does seem less intrusive than 8.
It's still vanilla; I haven't applied any patches yet. I believe there is one out.
 
Low and high risk processes are the way to go! This really improves performance. We did some tests at the time of the design, and the difference was very noticeable...
I think we left the list of high risk processes as it was, but we added a lot of programs to the list of low risk processes.
The way it works is that the process itself is still scanned for viruses when loaded. However, the files accessed by that process are no longer scanned (depending on your config).
This has a tremendous impact on processes that read/write a lot,
e.g. backup software. When you back up your PC, every file you back up is scanned. Is this what you want, knowing that you already do a full scan every day? The same goes for defrag software, etc. We included all our LANDesk processes, which changed the logon time significantly; the reason there was the "inventory module", which scans for all installed software on the PCs. As it was, this amounted to a full virus scan at the same time, since the inventory accesses every exe on the disk! I think you start to see the picture... We also made exclusions for software development tools (we do not want to scan everything when you compile, for instance...). On servers we also excluded processes like Oracle, Exchange (on top of the directories it uses), ColdFusion, etc...
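To make that concrete, here is the shape of the low-risk additions being described, written out as a simple list with the reasoning as comments. The executable names are examples only (your backup, inventory, and compiler products will have their own names), and this is not McAfee's policy format, just an illustration of what belongs on the list and why.

# Illustrative candidates for the "low risk processes" list.
# Executable names are examples only; substitute the products you actually run.
LOW_RISK_PROCESSES = [
    "ntbackup.exe",   # backup agent: reads every file, and those files get scanned elsewhere anyway
    "defrag.exe",     # defragmenter: only shuffles blocks, no new content enters the machine
    "ldiscn32.exe",   # LANDesk-style inventory: opens every .exe just to read version info
    "cl.exe",         # compiler: rescanning every intermediate file on each build hurts badly
]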

When surfing the web, there is really no difference, as all processes that talk to the web are absolutely, certainly "high risk" processes (e.g. iexplore, Firefox, Outlook Express, FTP, FileZilla, P2P, instant messaging and the like).

I work for a government agency. The PC lifecycle is 4-5 years (my own PC is one of the oldest; it's a P4 1.5 GHz with 512 MB RAM).
Mainly office use, however we do have a significant number of in-house developers as well...

CU
G.
 
I may read up a bit on the risk processes; not something I've really played with much.

And afraid I haven't got a console near me just now.

So, from memory: are there three policies for this, default, low, and high? If you enter the same process in both, I assume it will take the most restrictive?

If, for example, I have Sage running on a box, with lots of crappy dirs that it uses, all from \Sage downwards, and the exe that launches it (let's call it sage.exe for ease):

What happens if I add this only to low?
What happens if I add it only to high?

I will play with this when I'm back in, but that's about a week away, so you have me thinking.
 
You are right, there are 3 levels:
Default, low risk, and high risk.

You can "add" programs to either Low or High (mutually exclusive).
Any program not listed in either low or high automatically becomes a "default process".

What happens with each list is up to you. You have a separate configuration section for each of the three lists. The settings you see there are exactly the same as when not using this mechanism (not using it effectively means you're only using default processes, and consequently you only configure all the settings once).

What you could do, for example, is:
Low risk processes: do not scan on read, do not scan on write
High risk processes: scan on read and write
Default processes: scan on write
Same example continued:
Low risk: scan default extensions
High risk: scan all extensions
Default: N/A, as you don't scan anyhow (cf. supra)
Every list also has its own set of exclusions, such as which directories not to scan. E.g. if a process on the low risk list is accessing a file in d:\temp, it will not be scanned; however, when another process on the high risk list accesses a file in d:\temp, it is scanned.
Another possibility is the unwanted programs section. Again, every list has its own config for this.
Let's take iexplore.exe (on high risk): it scans for all categories of unwanted programs. safe.exe (on low risk) could be configured to scan only for dialers or something.
etc...
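If it helps to see the mechanics in one place, here is a rough model of how the three lists resolve to a scan decision. The field names, the process lists, and the should_scan() helper are purely illustrative; McAfee's real policy format looks nothing like this, but the behaviour described above is the same.

# Rough model of the three process lists and their per-list settings.
# Everything here is illustrative; it is not McAfee's configuration format.
POLICIES = {
    "low":     {"scan_on_read": False, "scan_on_write": False, "excluded_dirs": ["d:\\temp"]},
    "high":    {"scan_on_read": True,  "scan_on_write": True,  "excluded_dirs": []},
    "default": {"scan_on_read": False, "scan_on_write": True,  "excluded_dirs": []},
}

LOW_RISK  = {"backupagent.exe", "ldiscn32.exe"}            # example entries only
HIGH_RISK = {"iexplore.exe", "firefox.exe", "outlook.exe"}

def risk_of(process: str) -> str:
    if process in LOW_RISK:
        return "low"
    if process in HIGH_RISK:
        return "high"
    return "default"                                       # anything unlisted falls back to default

def should_scan(process: str, path: str, access: str) -> bool:
    """Decide whether a file touched by `process` gets scanned (`access` is "read" or "write")."""
    policy = POLICIES[risk_of(process)]
    if any(path.lower().startswith(d) for d in policy["excluded_dirs"]):
        return False                                       # each list carries its own exclusions
    return policy["scan_on_read"] if access == "read" else policy["scan_on_write"]

print(should_scan("iexplore.exe", "c:\\downloads\\setup.exe", "write"))        # True
print(should_scan("ldiscn32.exe", "c:\\program files\\app\\app.exe", "read"))  # False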

Let me know if you still have questions...
CU
G.



 
That is very useful info indeed. We use SMS, and whenever it runs its inventory, it essentially forces our clients to do a "scan". Putting it into the low risk process list would prevent that scan from running.

I think this is something worth considering and testing in our environment to see if it improves performance here.

Many thanks to you both!
 
This could be pie in the sky, but you'd probably get some benefit if you could get the developers to shift their working files up to file servers (and run server-based AV there instead of on the client). That would also help guard against data loss from local HDD failure.
 
I see that in my previous post I made a mistake in the example, in the part about which extensions to scan: I mixed up low risk and default processes...


@VictorySabre

SMS inventory is a good example of what to add to low risk.
In practice I would only add stuff to "Low risk processes". McAfee has no idea what is being used at your site, so a default low-risk list would have to be very long to cover all its customers.
They do have a good idea of what "High risk processes" are, and these are quite common. Therefore I would leave that list untouched or only slightly change it (by adding only, not by removing).
Everything else becomes "default".
Other good ones to add are, as said before, defrag software, backup software, anything that reads/writes a lot.
Also notice that since policies are different for servers and workstations, you can have different lists of low/high risk processes for each (in our server policies you find stuff like Oracle, which also has folder exclusions, Exchange, JRun, ColdFusion...).

@SPV
You're right, no important files should be stored locally. However, the example of compiling was only to show how it works. *.c files are probably not scanned anyway, and the final .exe will be safe (unless you're writing a virus :)).
So it does make sense to put compiler.exe in low risk, and that holds even when the files are on the network.
Don't forget that the anti-virus on the server also notices the performance impact when it has to scan all these files, so end-user performance might suffer there too.

G.
 
Hi SPV,

I agree that the data should be stored on a file server, and that would reduce the amount of data being scanned on the developers' PCs. I suggested that over 2 years ago, but I was told it won't fly because of our culture/environment.

Because of this, we've been forced to do nightly backups of both the servers and the workstations. Yes, that is silly and a waste of resources, and I've heard others say the same thing, but if I can't get this suggestion approved by management, I'm hoping that the comments provided by administrators outside of our company will serve as evidence that I'm not crazy in what I'm suggesting and that it can only improve our systems' performance.
 