Tag Archives: Enterprise Architecture

Unknowns we MUST factor into our new Data-Specific Roles

Data Experts or Gaps in other Business Roles

Very smart people who are motivated simply by mass processing, for whom "the bigger the data, the greater the accomplishment" or the assumed mark of higher expertise. Very smart people in business roles are still scratching their heads about how the audits continue to imply that workarounds are being allowed in controlled applications.

The problem is conflicting objectives and motivations, with no accountable leaders to manage the corporate policies within an organization.

  • Add in a two-part social media strategy in which workers are encouraged to promote and support positions irrespective of their validity or integrity.

The limitation with this audience relates to the non-technical outcomes caused by big data concepts.

  • Personally, I am not as supportive of data expertise or segregated data roles, given the threats these resources can influence, the lack of transparency among the authorized and entitled stakeholders, and the exclusions that allow an offline threat, even when these experts have been informed of the long-standing position that process outputs and records are controlled information, which must be retained as records via an image rather than converted from a record format back into data.

My experience as a resource across these roles suggests the following:

  • Data people do not agree with, and have a long-standing opposition to, business process stakeholders.
  • Data people also have a long-standing opposition to Records Management stakeholders.
  • Data people don't usually do well with Enterprise Architecture, or have traditionally been an afterthought in EA.

If the above is true, who has commissioned the tons of data and for what purpose?

  • The use cases are very much niche, or limited-audience only
    • Information used to bill or invoice any customer is a protected data set with the highest confidentiality within an organization or agency
      • Few experts acknowledge this vital corporate policy

What does it mean when a data expert does a data type change or creates a new physical table?

  • The source system may not maintain referential integrity with an offline shadow application

Changing a report, or connecting or changing the source data in order to present insights, carries the greatest threat of misuse by the sponsoring business stakeholder. Far too often the business stakeholder is actually an IT team within a business unit whose implementation failed adoption or was incorrectly implemented (regardless of who failed), and the fastest way to mask the problem is to produce insights that say the implementation went well and nothing has changed.

Software Assets – Investments

When an organization or agency plans a purchase of software from a 3rd party or an external software company, for example when transforming to cloud solutions, the purchase typically follows one of these models:

  • Public Cloud – An annual subscription by user, plus all costs to migrate your organization over to the new solution, and an annual maintenance fee.
  • Hybrid – An annual subscription by user, a partial in-house solution with the cost of any in-house servers and storage, plus migration and an annual maintenance fee.

In the case of a cloud solution, the annual maintenance should cover all normal events.  If you find your organization being asked to pay additional costs, hearing about "performance issues," or facing customizations that prevent your organization from introducing updates on a regular basis, these are all symptoms of an operational, quality, and governance problem.

In some situations:

  • the cloud provider may not be qualified to manage or host a cloud solution.

Your 3rd-party software provider has a list of exclusions and charges added fees due to implementation decisions.

Cost Saving Projections

If you were supplied a proposal to purchase software for a business unit, you must ask questions about the return on investment and total cost of ownership.  Far too often, business people assume software solves people, process, or technology problems without performing the appropriate process-benchmarking exercise with people who are unbiased. You want people who are not politically motivated (which is not to say these resources would not be sensitive to stakeholders' political points of influence). The most qualified resources are either an outside organization specializing in benchmarking, or a mature organization's Business Architect leading, with an Enterprise Architect validating.

Having an inventory of capabilities, and tools purchased with the same capabilities, isn't unusual. Not all software tools are known for, or specialize in, all capabilities. Instead, vendors design a solution that a company can buy and use without any integration, to meet small and medium customers' needs; large global customers need to rely on their IT architecture teams and data architects to redirect the software to the source, based on policies and procedures defined at the corporate level.

An example: Customer capabilities

Every software package I've evaluated in the past 20 years has the ability to create a customer.

A party management capability ensures the appropriate controls are designed into an architecture, with segregation of duties by design.   The create, read, update, and archive operations must be honored and monitored, or proven to annual audit stakeholders, and failures may force an organization to restate its performance where it fails to manage the due-diligence scope prescribed in Generally Accepted Accounting Principles (GAAP).  Non-GAAP revenue outside the innovation or advanced-technology risk is typically reported in a footnote or excluded from GAAP performance.

If an Enterprise and Business Architecture team isn't an invested resource supporting the organization or agency, or if these resources are unaware of the risk and its severity, we have a large audience that is perfectly aligned to its culture, and the unfortunate risk for the company happens to be a WICKED problem.

Test my theory:

  • Has your organization agreed upon the definition of a customer?
  • Does the definition align to BusinessDictionary.com, or was a decision made internally by consensus?

Define the Problem in measurable terms

The organization has operational waste with a high probability of impacting both top-line and bottom-line financials; considering the stakeholders in an operational-waste situation, the probability is also high for threats to reputation, competitive position, and regulatory standing.

Measure the Problem

  • How many applications have the customer-create capability?
    • Have the features been disabled?
    • Have the features been reduced to a search from a trusted source (the ERP, not tables in an EDW)?
  • If the answer is "no" to either of the above, ask the following:
    • How many of the feature's data points prompt a user to enter values without searching first?
    • How many of the data points are controlled to the ERP entry key controls?
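The measurement questions above can be sketched as a simple inventory check. This is a minimal, hypothetical example: the schema (`name`, `can_create_customer`, `is_system_of_record`, `feature_disabled`, `search_only`) is an assumption for illustration, not a real tool's data model.

```python
# Hypothetical application-inventory check: list apps whose customer-create
# feature is neither disabled nor reduced to a search against the trusted
# source. All field names here are illustrative assumptions.

def exposed_create_capability(inventory):
    """Return applications whose customer-create feature remains exposed."""
    return [
        app["name"]
        for app in inventory
        if app["can_create_customer"]
        and not app["is_system_of_record"]   # the ERP itself is the control point
        and not (app["feature_disabled"] or app["search_only"])
    ]

inventory = [
    {"name": "ERP",     "can_create_customer": True,  "is_system_of_record": True,  "feature_disabled": False, "search_only": False},
    {"name": "CRM",     "can_create_customer": True,  "is_system_of_record": False, "feature_disabled": False, "search_only": True},
    {"name": "Billing", "can_create_customer": True,  "is_system_of_record": False, "feature_disabled": False, "search_only": False},
    {"name": "Portal",  "can_create_customer": False, "is_system_of_record": False, "feature_disabled": False, "search_only": False},
]

print(exposed_create_capability(inventory))  # ['Billing']
```

Every name the check returns is a candidate for the follow-up questions: which data points prompt free-text entry, and which are controlled to the ERP entry key controls.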

Analyze the Problem

  • Application features that are not disabled must be considered people and process waste for the users in the application workflow.
  • Application features that are not disabled must be assumed to force rework, over-production, over-processing, and lower quality; in fact, another back-end process may be in place to cleanse or match these forced entries.
    • Nearly all organizations outside the controlled super-user group have an incentive to use the capability.
      • Most employees are not expected to understand how these corporate policies and procedures translate into the span of the employee's control.
        • Issue: Culture doesn't promote employee accountability
        • Issue: Leadership gap or management deficiency
        • Issue: Employee competency gap

Containment transition to Improvement

Motivation: Cost Effectiveness and Efficiency – Financial, Competitive, Reputation, and Regulatory

Physical Security – Executive officers' Section 404 assurances: the part of the risk that D&O insurance cannot cover for any executive officer.

  • Advanced or innovation (emerging) offers on a revenue transaction should not have been billed; their revenue MUST be deferred until First Customer Ship, plus delivery of all items in the offer and turn-up and acceptance of the system by the customer.
    • Every such transaction undermines accuracy when revenue is not deferred; instead, the norm is to allow internal billing, which often ignores the revenue-recognition rules in order to meet sales projections or marketing forecasts without a true performance gain.
    • The ability to accurately report financial statements requires that all exclusions be deferred, for up to a year or more in cases of innovation or advanced technology.
  • Each user in the application violates up to 15 key controls with every transaction, and all consuming processes should be considered operational-efficiency opportunities.
    • Include all threats for each transaction; the picture is understated when, as in most cases, only a single threat is reported.
    • In the case of a customer, the top-line and bottom-line financial impacts are far more costly than reported.
  • Each user's transactions must be reduced from revenue and from cash flow:
    • Reporting is forcing manual tools or analytical solutions outside the ERP.
      • Transactions that have not met the controls or procedures defined in corporate policies and international standards should be excluded from revenue.
        • They can be reported only as Non-GAAP revenue.
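The deferral rule above can be expressed as a simple decision function. This is a hedged sketch: the offer fields (`offer_type`, `first_customer_ship`, `all_items_delivered`, `customer_accepted`) are illustrative assumptions, not a real revenue-recognition schema.

```python
# Hypothetical sketch of the deferral rule: emerging offers must be
# deferred until First Customer Ship, delivery of every item in the
# offer, and customer acceptance of the turned-up system.

def must_defer_revenue(offer):
    """Return True when revenue on the offer must be deferred."""
    if offer["offer_type"] not in ("advanced", "innovation"):
        return False  # the rule applies only to emerging offers
    milestones_met = (
        offer["first_customer_ship"]
        and offer["all_items_delivered"]
        and offer["customer_accepted"]
    )
    return not milestones_met

emerging = {"offer_type": "innovation", "first_customer_ship": True,
            "all_items_delivered": False, "customer_accepted": False}
standard = {"offer_type": "standard", "first_customer_ship": False,
            "all_items_delivered": False, "customer_accepted": False}

print(must_defer_revenue(emerging), must_defer_revenue(standard))  # True False
```

The point of encoding the rule is that it can then be checked per transaction, rather than left to the internal-billing norms described above.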

In fairness, the above would be the worst case scenario and it certainly isn’t politically correct to take this to your executives.

  • Instead, most executives will support a security and resilience transformation.
  • That is the most effective way to transform these oversights.


Imagine being in a role where the executive hands you a stack of papers and wants to know why the organization has more than 300 applications charged by IT each quarter.

Imagine you are in charge of tools and just walked in after a host of new 3rd-party applications were launched, and users refuse to use the new investments.

Well, imagine you retire, or submit your retirement plan and move to a new role where you can ensure and influence the retirement of the legacy, out-of-scope, or shadow applications.  What if you are unaware that the new team you joined actually happens to be the reason these applications are being charged to your former group?

  • What if you tell your new manager, who designed half the applications that feed the list or use the list?
  • What if he announces he's leaving the company immediately after you convince him of the depth and severity of the issue?

We must acknowledge resource motivations and respond with a solution that contains the threats to an organization.

  • Data people rarely acknowledge in-house analytical applications as an APPLICATION when passing through the SDLC.
    • The result is a workaround to the security requirements of the organization.
    • A checklist mentality is being practiced.

I can agree only in situations where party, offer, and financial account codes are not included in the scope.

  • I'd agree that the analytical component MUST imply a separation of concerns.
    • This cannot be true the way we define Big Data today.

The analytical component changes to a transactional shadow application when “get” or “create” commands are in the code.

  • Transactional application (shadow) must be managed for all security components.
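The classification rule above ("get" or "create" commands make it transactional) can be sketched as a crude scan. This is an assumption-laden illustration: a real review would parse the code and cover more command verbs rather than pattern-match two words.

```python
import re

# Rough sketch of the rule above: if "get" or "create" commands appear
# in a component's code, treat it as a transactional shadow application
# rather than a purely analytical one. A real scan would parse the code
# and cover more verbs; this regex is only an illustration.
COMMANDS = re.compile(r"\b(get|create)\b", re.IGNORECASE)

def classify_component(source_code):
    """Classify a component as analytical or transactional shadow."""
    return "transactional shadow" if COMMANDS.search(source_code) else "analytical"

print(classify_component("SELECT AVG(amount) FROM sales GROUP BY region"))
print(classify_component("CREATE TABLE party_copy AS SELECT * FROM party"))
```

Anything the scan flags would then need the full security management described above, because it is writing or fetching records outside the system of record.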

What if those applications are only considered an application because of a transformation done outside the system of record?

  1. What if you performed time and motion studies on all roles in this functional audience?
  2. What if less than 50 of the legacy applications were being used by stakeholders?
  3. What if waste was being forced into the workflow on average seven times?
  4. What if six times the need was overstated in forecast and manufacturing?
  5. What if six times the need was being purchased and cost overages included people and process?

Unfortunately, Big Data has proven to include the above behaviors.

Big Data Unknowns

How do you identify whether your organization has assumed a risk by investing in Big Data?

If your organization happens to be investing in and promoting Big Data, especially following a security and resilience transformation, you may not have gained the benefits you intended.

You are unlikely to be positioned well for the cyber-security requirements, and you need to spend some time understanding the severity of the risk with a discreet assessment.

If your organization hasn't defined the priority zero-data-loss and zero-downtime scope, or you have more than 15 applications in this class in your resilience-response testing each release, I'd suggest a discreet security assessment.

Priority 1 – Probable applications and systems in the zero-data-loss and zero-downtime class

The ability to create master records within the Enterprise Resource Planning, Opportunity Management, Application Tracking System, CMDB, Extranet/Intranet, Records Management, Project Accounting, Supply Chain, and Service Management modules.

These transactions are the leverage points for most privacy, compliance, and security classification with highest protection or need to know only.

The threats for these transactions will always have the highest probability for all four security-principle risk types in any organization or agency.

Retirement of Legacy Systems

In cases of any of the above-mentioned gaps in your Enterprise Architecture or hosted solution providers:

  • The results are reported in a variety of ways, without any in-house expertise and without anyone having the motivation to understand the issues across functional boundaries.
  • Symptoms include lower-quality customer data, as users of these other applications are forced to enter fields they are neither qualified nor authorized to enter.
  • Symptoms include a higher number of null values in party tables.
  • Symptoms include a higher number of duplicate indices.
  • Symptoms include more pricing disputes.
  • Symptoms include more tax issues or disputes.
  • The projected savings suggested in the original business case often don't factor in retirement, and in many cases people are not informed that the legacy applications were designed into the new application and cannot be retired.
    • The proposed TCO/ROI must then assume no benefit over prior cost, or no possible savings, when the cost has likely doubled.
    • The issues are always blamed on the legacy system; in fact, reporting issues against the new systems is far more likely to be impossible.

Be cautious when noticing anomalies in the reporting around these subjects, as you will find a report was modified while the underlying changes were unlikely to have been put in place.

  • You did not achieve the benefits.
  • You have a report that shows "what if" you had invested in the effort, and it looked attractive.
    • However, no one took ownership of retiring the legacy systems.
  • You may have paid for the effort, but honestly, moving the chaos, or simply adding to it, was the only outcome, because data people are rarely able to manage, nor held accountable for, the dependencies, as they operate outside the system of record.

Offline Reporting or Analytical Insights

Instead, you are the proud owner of a report that reflects a subjective view of the organization's performance.

Insightful People

Don't be concerned; you are not alone.  Many leaders fall for the pitch: "we can do that for only a fraction of the cost; we are agile, unlike most of the stakeholders."

  • Higher performance is proposed, but you end up with higher TCO.
  • Higher ROI (return on investment) is promised, and you end up managing the technology out of specification.

Performance Gap

To help us understand how many people fell into the same trap, try counting the duplicate indices or null values in fields like the zip code or state.
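The counting exercise above can be run as plain SQL against a party table. This is a minimal sketch using an in-memory SQLite database; the `party` table and its columns are hypothetical stand-ins for whatever your customer master actually looks like.

```python
import sqlite3

# Minimal sketch of the performance-gap counts: null zip codes, null
# states, and duplicate party entries. The party table and its columns
# are hypothetical placeholders for a real customer master.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE party (id INTEGER, name TEXT, zip TEXT, state TEXT)")
conn.executemany("INSERT INTO party VALUES (?, ?, ?, ?)", [
    (1, "Acme Corp", "94105", "CA"),
    (2, "Acme Corp", "94105", "CA"),   # duplicate entry
    (3, "Globex",    None,    "TX"),   # missing zip
    (4, "Initech",   "10001", None),   # missing state
])

null_zips = conn.execute("SELECT COUNT(*) FROM party WHERE zip IS NULL").fetchone()[0]
null_states = conn.execute("SELECT COUNT(*) FROM party WHERE state IS NULL").fetchone()[0]
dupes = conn.execute("""
    SELECT COUNT(*) FROM (
        SELECT name, zip, state FROM party
        GROUP BY name, zip, state HAVING COUNT(*) > 1)
""").fetchone()[0]

print(null_zips, null_states, dupes)  # 1 1 1
```

On a real party table, non-trivial counts here are the symptom described earlier: users being forced to key fields they are neither qualified nor authorized to enter.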

Benchmark the Big Data Solution

Any person, organization, or agency should be reviewed in this context.  You will find that the largest tables are customer tables.  Now segment the three sub-types:

  • Identify the customer
  • Identify the supplier
  • Identify the employee

Assume every table suggests a unique list of customers, or a list that has had subjective rules applied by the people who build custom shadow analytical solutions.

Let me share the difference between analytical applications and operational applications; today these lines are blurred.  An analytical application is not controlled at the application source: it acquires the information, transforms it, and then reloads it.
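The acquire/transform/reload pattern just described can be shown in miniature. This is a deliberately tiny, hypothetical sketch: the point is that nothing in the pipeline is controlled by the source application, so the transform can rewrite any field on its way to an offline store.

```python
# The acquire/transform/reload pattern in miniature. Nothing in this
# pipeline is controlled by the source application: the transform can
# rewrite any field, which is exactly the risk described above.

def acquire(source_rows):
    return [dict(row) for row in source_rows]      # copy data out of the source

def transform(rows):
    for row in rows:
        row["name"] = row["name"].strip().upper()  # subjective "curation"
    return rows

def reload(rows, target):
    target.extend(rows)                            # land in an offline store
    return target

source = [{"id": 1, "name": " acme "}]
shadow_store = reload(transform(acquire(source)), [])
print(shadow_store)  # the source still says " acme "; the shadow says "ACME"
```

The source and the shadow store now disagree about the same party record, and only the system of record is audited.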

Hail Mary

The primary root cause of data-quality failures is the ETL process, and even fewer records of those changes are maintained by the developers.

The primary tool used to modify records or connect unconnected data sets happens to be an integration or ETL solution.  "Curated" or "processed" information are the newest and most relevant terms used to describe offline shadow applications.

These people are doing their work in fire drills, and even the best architects are only as effective as their clients demand them to be.  These resources are miracle workers.  The challenge is that they are not the right people to solve the problems.