
I had an argument yesterday with one of my colleagues. He (a business analyst, previously a programmer) thinks he should be aware of the technology used to implement the system so that he can make better design decisions. In my opinion (I am a programmer), analysis should not be coupled to the technology in any way, and I believe a good analyst can produce a great design without worrying about implementation details.

Am I right to think that way? Are there any reasons why a business analyst would need to know the technology used to implement the system?

EDIT: I believe I used the wrong term when saying business analyst; I am not used to these terms. I meant something more like an architect or a system analyst.

Thank you everyone for your awesome answers! I am not very experienced yet, and I am glad you opened my eyes to this.

Thomas Owens
marco-fiset

9 Answers


There are certainly cases where it makes sense for a business analyst to understand the technology, at least well enough to know when to question a business user about how important a particular feature really is. For example, if the business is accustomed to the behavior of a fat-client application while the new application is going to be web-based, there will likely be many "requirements" that would be trivial in a fat client but relatively difficult in a web application. If the business analyst understands whether a request from the business will be trivial for the development team or will involve 20 hours of AJAX development, they can figure out whether it makes sense to simply write down the requirement or to engage the business in exploring alternatives.

For any given project, there are likely many different sets of requirements that would, in reality, satisfy the business by making various trade-offs. The more the business analyst understands the trade-offs they are making, the more likely they are to deliver a set of requirements that maximizes the benefit to the business while minimizing the cost.

Justin Cave

Having worked both sides of this issue, I have to agree with the analyst. I have seen some spectacularly poor designs result from a lack of understanding of the capabilities of the technology. In some cases, this came from taking marketing hype as truth. In general, the problem has been producing specifications that don't match the technology's actual capabilities.

The analyst should be specifying What needs to be done, When, and by Whom. They should know Why it is being done; development priority should depend more on the Why than on the other factors. The design and development team need to handle the How. In order to develop cost-effective systems, the analysts need to specify what needs to be done in terms that don't push the boundaries of the available technology.

Pushing the boundaries can increase costs in a number of ways, but in some cases may have a significant return. Some of the cost factors are:

  • Experimentation may be required to develop a working solution;
  • New employees or consultants with specialized knowledge may need to be acquired;
  • Training on the new technology may be needed;
  • Development tends to be slower and bug rates higher; and
  • Extra effort may delay simpler solutions which have more immediate value.
BillThor

If the technology that will be used is known, it should be taken into consideration by analysts when creating the design. Different technologies do things differently, and a design that doesn't take those differences into account is going to have problems.

However, business analysts shouldn't care about what technology is used; their job is to gather business rules and make them understandable to the technical team. Systems analysts, architects, designers, or whatever other name they may go by should know the technologies being used and design around them, because they, not the business analysts, should be the ones doing the actual design.

Ryathal

I believe there is a point between the two lines of thought that is probably more realistic. While a high-level design may be best kept technology-agnostic, known real-world constraints and requirements must still be incorporated into the design. What level is this design? Do you have sufficient requirements? How flexible is the environment? Is management invested in a specific technical direction?

Are there no operational parameters that drive you in a specific direction? Do you have a broad array of resources capable of implementing a solution in any technology stack? Are there interoperability issues requiring access to other systems?

Answers to these questions are needed before you can definitively say whether the technology should be a part of the equation or whether the design should drive the technology selection.

Given no constraints and a very high-level design, I might agree with your thinking that the design should be truly agnostic. However, in my 20+ years of experience, I've rarely been in a situation where there weren't constraints that limited my choices -- and which drove my design toward specific technologies or technology families.


The ideal user interface would be one where the user thinks a thought, and what they wanted done is simply done. Anything short of that is constrained by the limitations of the technology at our disposal, so of course the BA needs to understand the context in which they can design the system.

gahooa

Different technologies can have very different cost and efficiency structures for solving a given problem. These costs can include things such as hiring costs in the local area, energy and cooling costs for specific systems, and opportunities to reuse existing code and equipment. So, yes, perhaps one can ignore these constraints and the details of specific technologies on a project where cost and efficiency matter far less than other considerations (aviation safety, nuclear plant control, medical implants, etc.). But in most business situations, management will care about the cost structure of the potential solutions versus the benefits of the system implementation.

hotpaw2

The business analyst should know what kind of application is being developed -- *web application, console application, mobile application, reporting application, etc.* -- so that she can come up with a sensible set of features for the application and push back on impossible user expectations, such as third-level nested drag and drop.

He or she does not need to know which specific technology (Java, C#, Python, SQL, etc.) will be used.

java_mouse

The analysis process itself needs to be entirely technology-agnostic. When you are researching the client and its needs, you need to do so with a completely open mind. The other side of the coin, however, is that the analyst is often asked to provide recommendations and may also be required to handle system architecture. This is an entirely different facet of the role, in which a wider understanding of the available technologies is crucial: it can make a huge difference to the customer, not only in the ability to get a project off the ground, but also in meeting the customer's long-term needs and sustaining the project itself.

While it's true that the larger part of designing software is essentially the same regardless of the technology used, there are always areas where the design will be influenced by the choice of technology. Platform choices may influence language and API choices, while availability of expertise, support, and even cost will also affect the choices made. So from one perspective, part of your position is justified: the actual analysis should be conducted without the influence of any specific technology. However, turning the analysis into a design will always require broader technology knowledge, so that the analyst can make recommendations that allow designs meeting the customer's needs to be applied.

S.Robins

Each technology has limits and constraints, so it makes some sense for an analyst to consider them. On the other hand, an analyst who knows .NET well but hasn't seen Java since the late nineties will most likely design a .NET solution -- using .NET terminology and design patterns -- even if Java (or RoR, etc.) would fit the problem better. It is relatively difficult to implement such a design in another technology later.

Therefore, I think an analyst should be technology-agnostic when the technology hasn't been selected yet, but should draw on their experience with it in those cases where the choice has already been made.

user281377