Channel: Symantec Connect: Data Loss Prevention (Vontu) Customer Group

DLP Flex Response error message


Dear all:

When user NT2486 presses the Flex Response button, the browser displays the following error (screenshot attached): "An unexpected error has occurred. This could be due to one of the following: 1) Your session timed out and you selected a link that was no longer valid, 2) You used the browser back or forward button placing the system into an inconsistent state, or 3) The system experienced a temporary problem." However, the incident itself shows normal processing.

Has anyone encountered this situation before? How can this problem be resolved?

 

Thank you

Chris


Upgrading to IT Analytics for Symantec Data Loss Prevention 3.0


Current users of IT Analytics for Symantec Data Loss Prevention 2.0 can now upgrade their installation to the new 3.0 version recently released by Symantec and gain significant benefits in both reporting and performance. This article outlines the process of upgrading from IT Analytics for Symantec Data Loss Prevention version 2.0 to version 3.0 in a simple, step-by-step format.

Upgrade Checklist

Before you perform the upgrade in your environment, consider the following:

  • This article assumes you will be upgrading on the same server. If you are moving to another server and installing IT Analytics for Symantec Data Loss Prevention 3.0 at the same time, consider the following article on migrating an IT Analytics installation.
  • Ensure the version of the Symantec Management Platform you are running is at least 7.1 SP2. If it is a prior version, you will need to upgrade the Symantec Management Platform before upgrading IT Analytics.
  • Perform a backup of the server hosting the Symantec Management Platform and IT Analytics Data Loss Prevention, using the backup tool of your choice.
  • Perform a backup of the CMDB database and the IT Analytics database in SQL Analysis Services (if SQL is hosted off-box). For more information about how to back up the CMDB database, see the following knowledge base article.
  • This article assumes you have administrator access to the Symantec Management Console.
  • Record the following configuration settings in the Symantec Management Console, in case you need to configure similar settings after the upgrade:
      - DLP IT Analytics connection settings to Analysis Services and Reporting Services under: Settings > Notification Server > IT Analytics Settings > Configuration
      - Connection settings to the DLP database under: Settings > Notification Server > IT Analytics Settings > Connections > Symantec Data Loss Prevention
      - Processing schedules under: Settings > Notification Server > IT Analytics Settings > Processing

CAUTION: When you initiate the upgrade of IT Analytics from 2.0 to 3.0, the existing cubes and reports are uninstalled due to the change in schema between versions. The new out-of-the-box reports and cubes must be reinstalled once the upgrade has completed. If you have customized any of the out-of-the-box cubes and reports in version 2.0, you must reapply those changes after upgrading to version 3.0. Any net-new reports or cubes that were created in the previous version are not removed by the upgrade; however, because of schema changes in the new version, they may not work as expected. If you have not modified the existing cubes or reports and have not developed any new cubes or reports, there are no additional steps beyond what is listed below.

 

Starting the Upgrade Process

Follow the steps below to upgrade to IT Analytics for Data Loss Prevention 3.0: 
  1. Open the Symantec Installation Manager by clicking: Start > All Programs > Symantec > Symantec Installation Manager, and allow the application to load.
  2. On the Installed Products screen, you should see at least one product available for upgrade.

NOTE: Clicking on 'Upgrading installed products' will allow you to upgrade to the latest version of IT Analytics for Symantec Data Loss Prevention; however, this may also include other product upgrades or Symantec Management Platform maintenance packs along with it. For the purposes of this article, we will use a method that upgrades only IT Analytics for Symantec Data Loss Prevention, as described below.

  3. Click on the Install New Products link at the top and on that screen, change the filter from Suites to Solutions.

  4. Scroll down the list and check Symantec IT Analytics Data Loss Prevention Pack 3.0, or simply search for 'analytics' in the upper right to do a quick find.

  5. Click Next.
  6. Optional - On the Optional Installations page, select the Language Packs for installation and then click Next.
  7. On the End User License Agreement page, verify that the correct products were selected, check 'I accept the terms in the license agreements,' and then click Next.
  8. Verify that your contact information has not changed and then click Next.
  9. On the Review Installation Details page, verify that Symantec IT Analytics Data Loss Prevention Pack 3.0 is listed.
  10. Click Begin install to start the download and installation process.
  11. If you are prompted to back up Notification Server cryptographic keys, click Skip. This step is not necessary for upgrading to IT Analytics for Data Loss Prevention 3.0.
  12. Verify that the Installation Complete screen is displayed and click Finish.
  13. On the resulting Installed Products screen, verify that the version for IT Analytics for Data Loss Prevention is now listed as 3.0.

 

Reinstalling Cubes and Reports

Once the upgrade completes, you need to reinstall the cubes and the reports that are included in IT Analytics for Data Loss Prevention version 3.0.

Reinstalling Cubes

  1. In the Symantec Management Console, on the Settings menu, click Notification Server > IT Analytics Settings.
  2. In the left pane, expand the Cubes folders.
  3. In the Cubes page, click the Available tab.
  4. Check all the cubes that you want to install. To install all of the available cubes, in the header row of the table, click Install.


  5. Click Save Changes.
  6. At the prompt, click OK to proceed with the installation.
  7. The IT Analytics Event Viewer window displays the progress of each cube that was selected. Click Close when the process has completed.


  8. Verify that the cubes were successfully created by clicking the Installed tab, and then review the list of cubes.

 

Reinstalling Reports

  1. In the left pane, expand the Reports folders.
  2. In the Report Setup window, click the Available tab.
  3. Check all the reports that you want to install. To install all of the available reports, in the header row of the table, click Install.


  4. Click Save Changes.
  5. At the prompt, click OK to proceed with the installation.
  6. The IT Analytics Event Viewer window displays the progress of each report that was selected. Click Close when the process has completed.


  7. Verify that the reports were successfully installed by clicking the Installed tab, and then review the list of reports.

 

Reconfiguring the Cube Processing Tasks

You can create and assign processing schedules for all installed cubes. Your business needs to dictate how often the cubes should be processed. For a typical configuration, all cubes should be processed daily. This task is essential for IT Analytics to function properly because the cubes do not contain any data until the cube processing is complete.

Note: If you had previously created cube processing tasks in the 2.0 version, those tasks should still be available after the upgrade, but because the cubes were uninstalled and reinstalled, you will have to reassociate the specific cubes with the appropriate processing tasks. Also, keep in mind that the new Incident Status History Cube will have to be assigned to a processing task.

To reconfigure the cube processing tasks:

  1. In the Symantec Management Console, on the Settings menu, click Notification Server > IT Analytics Settings.
  2. In the left pane, expand the Processing folders. You should see that all cubes require processing.


  3. If you are only using the default processing task, select the schedule that you want and then check the Enabled box. Symantec recommends processing cubes no more than once a day, depending on the number of cubes and the amount of data in your environment. If you are using previously configured processing tasks, check that the schedules are in line with expectations.
  4. Check the box for each available cube that you want to be processed on the current schedule. For a typical configuration, select all cubes; however, depending on the amount of data in your Oracle DLP database, you may need to create multiple processing tasks for optimum performance.
  5. Click Save Changes and confirm that the processing task is saved.
  6. You can either wait until the scheduled processing time, or click Run Now. The selected processing tasks start asynchronously, which means that the task does not finish by the time the page refreshes. This task can take several minutes to execute. The execution time depends on the number of cubes that are selected and the size of the data within the database. You can monitor progress by viewing the events in the IT Analytics Event Viewer window while the manual processing task executes.


  7. After the processing trace has completed, click Close. You should notice that all of the cubes have now been processed.


 

Verifying Your Upgrade

After cube processing completes, you can verify your upgrade and ensure that all of your configuration steps completed successfully.

To verify your upgrade:

  1. In the Symantec Management Console, on the Reports menu, click All Reports.
  2. In the left pane, under IT Analytics, expand the Cubes folder and then click on the new Incident Status History cube.
  3. From the pivot table field list, drag in Status Changes and Incident - Product Area to create a quick cube view and ensure you are getting data. This indicates that both the upgrade and cube processing completed successfully.


  4. In the left pane, under IT Analytics, expand the Reports folder and then click on the new DLP Remediation - Incident Search report. You should also see a much longer list of reports than before. Once this report loads, it indicates that the new reports from the upgrade were installed successfully.


 

Considerations When Upgrading from Symantec Data Loss Prevention 11.x to 12.x

In cases where you have already installed IT Analytics for DLP 3.0 in a Symantec DLP 11.x environment and later upgrade to Symantec DLP 12.x, you may observe that cubes no longer process after the upgrade. If this is the case, you will need to reapply the DLP database credentials to the IT Analytics DLP connection for cubes to process successfully. This is due to schema changes between Symantec DLP 11.x and 12.x. All that is necessary is to reapply the credentials for the backend Oracle DLP database, which triggers a background script that updates the appropriate views within DLP IT Analytics to the correct version for Symantec DLP 12.x.

NOTE: These steps are only necessary if you have upgraded your Symantec DLP environment from 11.x to 12.x after installing DLP IT Analytics 3.0.

To reapply DLP IT Analytics 3.0 connection credentials after upgrading to Symantec DLP 12.x:

  1. In the Symantec Management Console, on the Settings menu, click Notification Server > IT Analytics Settings.
  2. In the left pane, expand the Connections folders.
  3. Click Symantec Data Loss Prevention.
  4. In the right pane, select a server that has already been configured as a connection from the drop-down list. The information appears for the server that you selected.
  5. Re-enter the DLP database credentials in the following fields:
  • DLP Database Username
  • DLP Database Password
  • DLP Database Password Confirmation
  6. Click Apply.

Upgrading to IT Analytics for Symantec Data Loss Prevention 3.0


Current users of IT Analytics for Symantec Data Loss Prevention 2.0 can now upgrade their installation to the new 3.0 version recently released by Symantec and gain significant benefits in both reporting and performance. This video outlines the process of upgrading from IT Analytics for Symantec Data Loss Prevention version 2.0 to version 3.0 in a simple, step-by-step format.


IT Analytics for Symantec Data Loss Prevention 3.0 - Cube Processing Recommendations


Cube Process Scheduling Recommendations

IT Analytics for Symantec Data Loss Prevention 3.0 extracts data from the Oracle DLP Enforce database(s) on a scheduled basis. The extracted data is then stored in multi-dimensional cubes within the Microsoft Analysis Services database; once processed, these cubes act as the data sources for the reports and dashboards in IT Analytics.

The frequency of the cube processing schedules determines how current the data in each cube is. Depending on business requirements, this frequency may vary, but the general recommendation for cube processing is once a day for some cubes and weekly for others (as described below). Note that several variables affect the duration of cube processing tasks, but the two major factors are:

  1. Hardware specifications of the SQL Server hosting Analysis Services
  2. Amount of data being processed (i.e. overall size of the Oracle DLP database)

The lower the hardware specifications of the SQL Server and the greater the amount of data to process, the longer processing will take, and vice versa. To optimize cube processing performance, it is recommended that you create two separate tasks that process cubes on two different schedules, per the grouping below:

Group 1 Cubes (Process Daily)           Group 2 Cubes (Process Weekly)
DLP Incident Summary Cube               DLP Incident Details Cube
DLP Discover Incident Summary Cube      DLP Discover Incident Details Cube
DLP Endpoint Incident Summary Cube      DLP Endpoint Incident Details Cube
DLP Network Incident Summary Cube       DLP Network Incident Details Cube
DLP Agent Status Cube                   DLP Policy History Cube
                                        DLP Incident Status History Cube
                                        DLP Discover Scans Cube
                                        DLP Incident History Cube
                                        DLP User Action Audit Cube
                                        DLP Network Statistics Cube

The first task includes all the DLP summary cubes and is processed daily. This should provide enough information on a daily basis to give end users the visibility they need into their DLP environment. The second task includes the more detailed and historical cubes, which only need to be processed weekly. This grouping helps expedite cube processing and ensures the right data is available for end users.

 

Cube Processing Benchmarks (General Estimates)

Your business requirements may stipulate that data must be updated daily, in which case all cubes may need to be processed each day. Using the cube groupings outlined above, you can run these tasks sequentially on a daily basis; however, be careful to allow enough time for the first task to finish before the next one begins. Again, depending on hardware resources and the amount of data in the DLP database, this will take some trial and error to optimize completely. To help you get started, the tables below provide some general benchmarking estimates for cube processing (based on environment size and hardware specifications) so you can determine the approximate times necessary for your environment.

NOTE: The processing intervals listed below are estimates ONLY. Your times will vary based on the hardware specifications and amount of data in your environment. These times are offered as general guidelines only.

 
Incident Count        Small     Medium     Large
Endpoint Incidents    5,000     10,000     4,000,000
Network Incidents     40,000    500,000    4,000,000
Discover Incidents    10,000    50,000     1,000,000

 
Hardware Component    Small        Medium       Large
Hardware Type         Virtual      Virtual      Physical
Processors            Quad Core    Eight Core   64 Core
RAM                   8GB          8GB          256GB

The table below provides guidance on the impact the SQL Server hardware (as defined above) has on the time it takes to process a given cube.

IT Analytics DLP Cubes                Processing Times per SQL Hardware Option
                                      Small     Medium    Large
DLP Administrative Events Cube        10s       10min     30min
DLP Scans Cube                        30s       5min      30min
DLP Agent Status Cube                 20s       20s       1hr
DLP Network Incident Summary Cube     3min      30min     2hrs
DLP Discover Incident Summary Cube    4min      5min      3hrs
DLP Endpoint Incident Summary Cube    3min      1min      3hrs
DLP Incident Summary Cube             3min      30min     3.5hrs
DLP Incident Status History Cube      30min     2hrs      4.5hrs
DLP Messages                          5s        1hr       3hrs
DLP Network Incident Details Cube     3min      1hr       5hrs
DLP Discover Incident Details Cube    4min      5min      5hrs
DLP Endpoint Incident Details Cube    3min      1min      5hrs
DLP Incident Details Cube             3min      1hr       5hrs
DLP Incident History                  3min      1hr       5hrs
DLP Policy History Cube               1min      45min     4hrs
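As a rough planning aid, the per-cube estimates above can be summed to sanity-check whether a sequential daily processing task fits your maintenance window. The sketch below uses the "Medium" column estimates for the Group 1 (daily) cubes; treat these numbers as illustrative inputs only, since real durations depend entirely on your hardware and data volume:

```python
# Rough maintenance-window check for the daily (Group 1) processing task.
# Times are the "Medium" column estimates from the table above, in seconds;
# substitute measured values from your own IT Analytics Event Viewer traces.
MEDIUM_ESTIMATES_S = {
    "DLP Incident Summary Cube": 30 * 60,
    "DLP Discover Incident Summary Cube": 5 * 60,
    "DLP Endpoint Incident Summary Cube": 1 * 60,
    "DLP Network Incident Summary Cube": 30 * 60,
    "DLP Agent Status Cube": 20,
}

total_s = sum(MEDIUM_ESTIMATES_S.values())
print(f"Estimated sequential runtime: {total_s / 60:.1f} minutes")
```

If the estimated total runs past the start of business hours, that is a signal to split the group into additional processing tasks as described above.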

 

Upgrading 11.6 to 12.0


Hi All

We are planning to upgrade DLP 11.6 to 12.0 and wanted to know if any features were removed in DLP 12.0 compared to 11.6.

I've checked the What's New guide and the Release Notes but don't see any feature removals as such, so I thought I'd ask the DLP experts. Thanks

 

 

Need admin guide and install guide for Sym DLP


Hi,

 

I need the Admin Guide and Install Guide for both versions 12.0 and 11.6.3.

Can somebody help?


IT Analytics for Symantec Data Loss Prevention - Glossary of Terms


IT Analytics introduces powerful ad-hoc reporting and business intelligence tools, and with them a few terms that may be new to you. To alleviate any confusion, this article defines a few key terms so that you can easily understand the out-of-the-box functionality and start using the tool to gain deeper insight into your DLP data and make informed decisions.

Measure: Measures are the aggregate count, or how you quantify results when creating a pivot table view. These typically make up the columns in your report. Every view you create must contain at least one measure. (For example: Incidents Count)

Dimension: Dimensions are a grouping of specific data types you are quantifying when you create a pivot table view. These typically make up the rows in your report, but dimensions can be used across columns or as filters. Every view you create must contain at least one dimension. If you have more than one dimension, you can drill in and out or change the order of dimensions to arrange the report the way you want it. Please see the Connect article for a list of all dimensions in IT Analytics.

Attribute: An attribute is a sub-grouping of data types for a specific dimension. A dimension may have one or more attributes, and these can be used like any other dimension. (For example: Policy - Description, Policy - Status, Policy - Name, Policy - ID). Please see the Connect article for a list of all dimension attributes in IT Analytics.

Key Performance Indicator (KPI): Quantifiable measures that represent a critical success factor in an organization. The emphasis is on the action of quantifying something in the environment. KPIs must be measurable to be successfully monitored and compared against a given objective. (For example: Number of Alerts in the Last 30 Days). Please see the Connect article on creating a key performance indicator in IT Analytics.

Cube: Multidimensional data structures (as opposed to a relational database) that store precompiled information from the DLP Oracle database(s). Cubes contain measures and dimensions that are arranged in a specific way for common reporting purposes. They are the underlying source for all reporting in IT Analytics and are stored in the Analysis Services component of SQL Server. Please see the Connect article for a list of all cubes in IT Analytics.

Report or Dashboard: Pre-developed reports that are hosted by the Reporting Services component of SQL Server. Several out-of-the-box reports and dashboards are available upon install, and you have the flexibility to create your own through Report Builder.

SQL Analysis Services: The free component of SQL Server that hosts and processes all cubes within IT Analytics. This component is required to install IT Analytics. Please see the Connect article on configuring Analysis Services and installing IT Analytics.

SQL Reporting Services: The free component of SQL Server that hosts all reports and dashboards within IT Analytics. This component is required to install IT Analytics. Please see the Connect article on configuring Reporting Services and installing IT Analytics.

Report Builder: A client-side application (developed by Microsoft and free with Reporting Services) that you can use to create and design reports. Using Report Builder, you can design reports based on your data from within IT Analytics, without having to understand the underlying schema or complex programming languages. Please see the Connect article on creating custom reports in Report Builder.

Pivot Table: An arrangement of measures and dimensions from a specific cube in tabular form, with the goal of creating an ad-hoc report. Please see the Connect article on working with pivot tables in IT Analytics.

Pivot Chart: An arrangement of measures and dimensions from a specific cube in chart format, with the goal of creating a visually informative report. Please see the Connect article on working with pivot tables in IT Analytics.

Content Pack: A software component that bundles cubes, reports and dashboards specific to a particular Symantec solution suite. IT Analytics content packs are currently available for:

  • IT Management Suite (Altiris)
  • Symantec Endpoint Protection
  • Data Loss Prevention
  • Critical System Protection
  • ServiceDesk

Parameter: Typically a dimension attribute used to filter data within an IT Analytics report or dashboard. This technique is used within Report Builder when creating reports.

Processing Schedule: The frequency at which data is purged and then recompiled within the IT Analytics cubes. Typically this is done once a day, but depending on the environment, server resources and business requirements, it can be set to process more frequently. The schedule is set within the configuration page of IT Analytics, but the processing itself occurs within SQL Analysis Services.

Symantec Management Platform: The application that hosts the IT Analytics configuration and reporting interface. It is required to install IT Analytics. Please see the Connect article on installing the Symantec Management Platform.

Symantec Installation Manager: The application that allows you to download, install and update software hosted by the Symantec Management Platform, including IT Analytics. To install the Symantec Installation Manager, please download the IT Management Suite trialware from Symantec's trialware site.

 

Deriving Business Value from IT Analytics Symantec Data Loss Prevention 3.0 Reports


The newest version of the IT Analytics Symantec Data Loss Prevention Content Pack, 3.0, enables DLP program managers to easily leverage numerous reports and dashboards that communicate the value of the DLP program to executive sponsors and business unit stakeholders. It provides in-depth visibility into DLP operational data across its functions for auditors, IT managers and incident remediation managers. In this latest release, significant effort was put into several real-world business use cases to maximize the overall investment in the Symantec Data Loss Prevention tool.

This document takes a detailed look at the almost 40 reports available out-of-the-box in IT Analytics Symantec Data Loss Prevention, providing report descriptions, tailored use cases and the resulting business value, and posing questions that can be answered by using IT Analytics to improve an organization's risk posture.


Modifying IT Analytics Reports to Decrease Load Times


For IT Analytics reports hosted by SQL Server Reporting Services (which include the out-of-the-box reports and dashboards), some users may experience extended wait times before a report renders, and in some cases the report may even fail or time out. This typically occurs in environments with extremely large data volumes where reports are pre-configured to display all data by default (without filtering).

To minimize the time it takes for some IT Analytics reports to load, you can configure settings so that data is filtered by default, greatly improving performance. This article describes that process using Microsoft Report Builder, a client-side application that comes with SQL Server Reporting Services.

 

Selecting the Report

  1. To load Report Builder, open Symantec Management Platform Console and navigate to: Settings > Notification Server > IT Analytics Settings > Reports > Report Builder. Then click on the Launch Report Builder button.


  2. If prompted, click Run to start downloading the application.
  3. Be patient; launching the application may take a minute or two, depending on connectivity. While the application loads, a progress message is displayed.


  4. Click Open to select the report you want to optimize. Under the default configuration, all IT Analytics reports are stored within the ITAnalytics folder off the ReportServer root.


NOTE: For the purposes of this example, we will select the IT Analytics Configuration Events report. As such, the parameters you select in the next section of this article may be different.

 

Modifying the Report

  1. When the report opens, expand Parameters in the Report Data pane on the left, then right-click the From parameter and select Parameter Properties.
  2. Navigate to Default Values > Get values from a query. For both fields, select MaxDate from the drop-down list, then click OK.

NOTE: Write down the original values before making any modifications.

  3. Modify the properties for the Types parameter and navigate again to Default Values. Select Specify Values and Add new value. In the field, type 'Cube Processing' (without quotes), then click OK. If you are modifying a different report, select an appropriate parameter and input a default value that matches a value for that parameter.
  4. Modify the properties for the Targets parameter and navigate again to Default Values. Select Specify Values and Add new value. In the field, type 'Processing Trace' (without quotes), then click OK. If you are modifying a different report, select an appropriate parameter and input a default value that matches a value for that parameter.
  5. Click Save to apply all changes (saving may take a couple of seconds).
 
 

Reloading the Report

  1. In the Symantec Management Console, navigate to Reports > IT Analytics > Reports > IT Analytics Events > ITA Configuration Events (or the report you modified). Refresh the browser if necessary. You should notice that the report loads much faster than before.
  2. Verify that the current day (in the From parameter) and the specific Type and Target are pre-selected with the values entered previously (or the appropriate values you entered if you modified a different report).


  3. To adjust the parameter values, select different values as you normally would.


 

Preventing IT Analytics Reports from Running Automatically


By default, reports and dashboards in IT Analytics (hosted by SQL Server Reporting Services) run automatically when selected. This behavior can be problematic in environments with excessive amounts of data, resulting in increased load times or timeouts for certain reports. One option to prevent this is to modify the value of a parameter within a report so that it does not execute automatically when clicked; this process is documented below.

To modify report parameters in more detail in order to load reports more efficiently, please see the article: Modifying IT Analytics Reports to Decrease Load Times.

  1. For the purposes of this example we will use the DLP Endpoint Incident Search report. Note that by default, the report runs with the Policy Name parameter value of 'All' pre-selected.


  2. To change this report, open a browser on the server hosting SQL Reporting Services and go to: http://localhost/Reports.
  3. From within the IT Analytics folder, locate and hover over the DLP Endpoint Incident Search report (or the report you want to modify); to the right you will see a menu of options. Select Manage.


  4. Once in the report management screen, select the Parameters section.


  5. Uncheck the 'Has Default' column for the parameter you want to force users to select or enter information for. This ensures a user must select a value for that parameter before the report can be executed. In this example we use the Policy Name parameter. Then click Apply.


  6. Return to the Symantec Management Console and load the report. You should notice that the report does not execute automatically, but instead prompts for a parameter value. Once a value is selected, click View Report. Refresh the browser if necessary.

EDM Best Practice- multiple token source data, and handling of empty cells


I am in the process of designing EDM indexes for customer data but have encountered a design issue relating to cells containing multiple tokens. I would greatly appreciate it if anyone could confirm:

1) how DLP handles empty cells (not entire columns, just cells) and if there is any performance impact resulting from this.

2) what the best practice or recommended approach is to creating EDM indexes for multi-token customer data

 

I know that EDM cannot match unstructured content against multiple-token index cells. Many customer names, however, have multiple first names and multiple surnames (the examples in the DLP training materials are "Mary Jane" and "von Batten"). The two options, as far as I can see, are:

 

1. Remove part of the name, leaving only one token and excluding all others from the matching process, e.g. "Mary" or "Batten". The obvious downside to this is that you are excluding potentially key data.

2. Split the original name into multiple cells, allowing matching to be performed against all parts of the name, using EDM.SimpleTextProximityRadius to reduce false positives.

 

The second option would work perfectly, provided the index file is created with enough columns to accommodate the longest customer name. This would, however, result in empty cells for any customers who have shorter names. For example:

Row 1:  Mary | Jane | von | batten | - all cells are filled with data.

Row 2: John | | | Smith |          - note the two empty cells here.
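The split-and-pad approach in option 2 can be sketched as a small preprocessing step over the source data before indexing. This is purely illustrative: the column count and the "surname in the last column" padding scheme are assumptions mirroring the rows above, not a documented EDM requirement:

```python
def pad_name_tokens(full_name, width):
    """Split a name into single-token cells, keeping the first name(s) at the
    front and the surname in the last column, padding with empty cells in
    between so every row has the same number of columns."""
    tokens = full_name.split()
    if len(tokens) > width:
        raise ValueError(f"{full_name!r} has more than {width} tokens")
    if len(tokens) == width:
        return tokens
    return tokens[:-1] + [""] * (width - len(tokens)) + tokens[-1:]

names = ["Mary Jane von Batten", "John Smith"]
width = max(len(n.split()) for n in names)  # the longest name sets the column count
for name in names:
    print(" | ".join(pad_name_tokens(name, width)))
```

Run against the two example names, "Mary Jane von Batten" fills all four cells while "John Smith" is padded to "John | | | Smith", matching the row layout described above.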

If anyone has encountered this issue themselves or has advice regarding best practice, I'd greatly appreciate your input.

Robin

Endpoint Discover: Two full scans, very different results


I am using Endpoint agent v12 to do DAR discovery for PCI data on one production PC.

  • One vanilla PCI DSS policy - No exceptions to the credit card number DI, Wide breadth, No optional validators, Count all unique matches (at least 1 match), Subject/Body/Attachments. No EDM or IDM.
  • The Discover Target does not contain any Include/Exclude filters for file types or location, nothing filtered by size or date, 'Only scan files added or modified since the last full scan' is unchecked and 'Make next scan a full scan' is greyed out. Scan idle timeout is 10 minutes and Max scan duration is 2 days.

A full scan of the PC was kicked off and took 8 hours 10 minutes to complete, producing zero incidents. That is about half the time previously observed when scanning the same (or a similar) PC. Suspecting that the scan time and results were a little too good to be true, another full scan was kicked off the following day. The second scan took 13 hours 48 minutes to complete, producing 125 incidents.

Looking at the statistics reports for each day, the items-scanned and bytes-scanned numbers are very close, but the items-unprocessable numbers differ greatly:

  • On the day the scan completed in 8 hours, 53,677 items unprocessable
  • On the day the scan completed in 13 hours, 435 items unprocessable
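As a rough sanity check (a sketch using only the figures above; the per-item cost is an estimate, not a measured value), the time gap lines up with the gap in processed items:

```python
# Rough sanity check with the reported figures: does the unprocessable-item
# gap plausibly account for the scan-time gap?

scan_fast = {"hours": 8 + 10 / 60, "unprocessable": 53_677}   # 8 h 10 m scan
scan_slow = {"hours": 13 + 48 / 60, "unprocessable": 435}     # 13 h 48 m scan

extra_hours = scan_slow["hours"] - scan_fast["hours"]
# Items the slower scan actually processed that the faster one skipped.
extra_items = scan_fast["unprocessable"] - scan_slow["unprocessable"]

seconds_per_item = extra_hours * 3600 / extra_items
print(f"{extra_items} extra items, {seconds_per_item:.2f} s per item")
# 53242 extra items, 0.38 s per item
```

A fraction of a second of detection time per item is plausible for content inspection, which supports the reading that the first scan was fast because it skipped those items rather than scanning them.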

At a high level, I'm looking at the difference between these numbers as the reason for the shorter scan time. Are there specific places in the agent logs that can help explain this?

Archiving incidents


1] Incident Archiving:

Incident archiving lets you flag specified incidents as "archived." Because these archived incidents are excluded from normal incident reporting, you can improve the reporting performance of your Symantec Data Loss Prevention deployment by archiving any incidents that are no longer relevant. The archived incidents remain in the database; they are not moved to another table, database, or other type of offline storage.

You can set filters on incident reports in the Enforce Server administration console to display only archived incidents or to display both archived and non-archived incidents. Using these reports, you can flag one or more incidents as archived by using the Archive options that are available when you select one or more incidents and click the Incident Actions button. The Archive options are:

i] Archive Incidents - Flags the selected incidents as archived.

ii] Restore Incidents - Restores the selected incidents to the non-archived state.

iii] Do Not Archive - Prevents the selected incidents from being archived.

iv] Allow Archive - Allows the selected incidents to be archived.

The archive state of an incident displays in the incident snapshot screen in the Enforce Server administration console. The History tab of the incident snapshot includes an entry for each time the Do Not Archive or Allow Archive flags are set for the incident.
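Conceptually, each incident carries two independent flags: an archived state and a do-not-archive lock. The model below is a hypothetical illustration only (the class, field, and method names are mine, not Symantec DLP's actual API or schema), showing how the four Archive options interact:

```python
# Hypothetical model of the archive flags described above; all names
# are illustrative, not Symantec DLP's actual schema or API.

from dataclasses import dataclass

@dataclass
class Incident:
    incident_id: int
    archived: bool = False        # excluded from normal reports when True
    archive_locked: bool = False  # the "Do Not Archive" flag

    def archive(self) -> None:
        """Archive Incidents: flag as archived unless locked."""
        if self.archive_locked:
            raise PermissionError("incident is flagged Do Not Archive")
        self.archived = True

    def restore(self) -> None:
        """Restore Incidents: return to the non-archived state."""
        self.archived = False

    def do_not_archive(self) -> None:
        """Do Not Archive: prevent future archiving."""
        self.archive_locked = True

    def allow_archive(self) -> None:
        """Allow Archive: clear the Do Not Archive flag."""
        self.archive_locked = False

def archived_only(incidents):
    """Analogue of the 'Is Archived / Show Archived' report filter."""
    return [i for i in incidents if i.archived]
```

The point of the model is that archiving only hides incidents from default reports; the data stays in place, and restoring is just clearing the flag.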

Access to archiving functionality is controlled by roles. You can set the following user privileges on a role to control access:

i] Archive Incidents - Grants permission for a user to archive incidents.

ii] Restore Archive Incidents - Grants permission for a user to restore archived incidents.

iii] Remediate Incidents - Grants permission for a user to set the Do Not Archive or Allow Archive flags.

2] To archive incidents:

A] Open the Enforce Server administration console and navigate to an incident report.
B] Select the incidents you want to archive, either by selecting the incidents manually or by setting filters or advanced filters to return the set of incidents that you want to archive.
C] Click the Incident Actions button and select Archive > Archive Incidents. The selected incidents are archived.

 

3] Restoring archived incidents:

To restore archived incidents

A] Open the Enforce Server administration console and navigate to an incident report.
B] Select the Advanced Filters & Summarization link.
C] Click the Add filter button.
D] Select Is Archived in the first drop-down list.
E] Select Show Archived from the second drop-down list.
F] Select the incidents you want to restore, either by selecting incidents manually or by setting filters or advanced filters to return the set of incidents you want to restore.

The selected incidents are restored.

4] Preventing incidents from being archived:

You can prevent incidents from being archived using either an incident report or an incident snapshot.

To prevent incidents from being archived using an incident report.

A] Open the Enforce Server administration console and navigate to an incident report.
B] Select the incidents you want to prevent from being archived. You can select incidents manually or by setting filters or advanced filters to return the set of incidents you want to prevent from being archived.
C] Click the Incident Actions button and select Archive > Do Not Archive.
The selected incidents are prevented from being archived.

Note:  You can allow incidents to be archived that you have prevented from being archived by selecting the incidents and then selecting Archive > Allow Archive from the Incident Actions button.
 

To prevent an incident from being archived using the incident snapshot.

A] Open the Enforce Server administration console and navigate to an incident report.
B] Click on an incident to open the incident snapshot.
C] On the Key Info tab, in the Incident Details section, click Do Not Archive.

Note:  You can allow an incident to be archived that you have prevented from being archived by opening the incident snapshot and then clicking Allow Archive in the Incident Details section.

5] Deleting archived incidents:

To delete archived incidents

A] Open the Enforce Server administration console and navigate to an incident report.
B] Click the Advanced Filters & Summarization link.
C] Click Add filter.
D] Select Is Archived in the first drop-down list.
E] Select Show Archived from the second drop-down list.
F] Select the incidents you want to delete. You can select the incidents manually or you can set filters or advanced filters that return the set of incidents you want to delete.
G] Click the Incident Actions button and select Delete incidents.
H] Select one of the following delete options:

i] Delete incident completely -  Permanently deletes the incident(s) and all associated data (for example, any emails and attachments). Note that you cannot recover the incidents that have been deleted.
 
ii] Retain incident, but delete message data -  Retains the actual incident(s) but discards the Symantec Data Loss Prevention copy of the data that triggered the incident(s). You have the option of deleting only certain parts of the associated data. The rest of the data is preserved.
 
iii] Delete Original Message -  Deletes the message content (for example, the email message or HTML post). This option applies only to Network incidents.
 
iv] Delete Attachments/Files -  This option refers to files (for Endpoint and Discover incidents) or email or posting attachments (for Network incidents). The options are All, which deletes all attachments or files, and Attachments with no violations, which deletes only those attachments or files in which Symantec Data Loss Prevention found no matches. The latter is useful, for example, for incidents involving individual files extracted from a compressed file (Endpoint and Discover incidents) or several email attachments (Network incidents).
 

I] Click the Delete button.

Incorporate quarantine Emails for Exchange


Hi Symantec,

Please incorporate the following feature in DLP Discover:

1. An option to quarantine individual email items from Exchange mailboxes with response rules.

agent not reporting after Endpoint detection server services restarted


hi All,

 

The agent is not reporting after the Endpoint detection server services were restarted. The attached file is an exported log.

How can I resolve this issue?

Thanks


Data Loss Prevention (Vontu) support for SharePoint 2013


Any ideas or a published roadmap for the support of Symantec DLP for SharePoint 2013?

We are looking for an enhanced feature that provides DLP Prevent/Protect functionality with the SharePoint 2013 web API, so that customers can block and prevent sensitive data from reaching the SharePoint space.


SMA & DLP Agent Integration Behavior


This document describes the various scenarios you can encounter when deploying the Symantec Management Agent or the Symantec DLP Agent in your environment when both agents may be present.

Recently we had an issue in which millions of bad events were produced on our Altiris servers because of the automatic integration and registration of the two agents. Because we were not using the DLP IC, we had to figure out a way to stop the integration from occurring and prevent the DLP Agent info events from being generated during a Basic Inventory from the SMA.

Because of this integration, we had to perform Scenarios 1 and 3 to properly split the two agents and prevent DLP events from being sent to our Altiris servers.

Symantec Ideas


Do you have an enhancement request that would make Symantec Data Loss Prevention better or would improve the usability of the product?

If you are not aware, Symantec Connect has an Ideas portal for submitting and tracking enhancement requests. Once your idea has been submitted, other community members can add comments and vote on it. The most popular ideas rise to the top, where they are reviewed by Symantec Product Managers. Look here for a quick demo of how Ideas works.


Curtis Carroll
Symantec DLP Product Manager
