Monday, December 29, 2008

Creating and Installing VS Templates

Creating Templates:
http://weblogs.asp.net/scottgu/archive/2006/09/04/Tip_2F00_Trick_3A00_-Creating-Re_2D00_Usable-Project-and-Item-Templates-with-VS-2005.aspx

Installation of Class Library Template
1. In VS 2008, click on the “Tools” menu.
2. Click on the “Options” sub-menu.
3. Select “Projects and Solutions” -> “General”.
4. Extract the contents of the zip file and copy them to the location shown under “User projects template location:”.

Installation of Item Templates
1. In VS 2008, click on the “Tools” menu.
2. Click on the “Options” sub-menu.
3. Select “Projects and Solutions” -> “General”.
4. Extract the contents of the zip file and copy them to the location shown under “User item template location:”.

Tuesday, September 16, 2008

Logging Best Practices

Applications need to log diagnostic information to detect problems when something goes wrong and unexpected things happen. Developers need extensive trace information in the logs for debugging during and after development. Good logging practices will go a long way in serving these requirements effectively.

Logging API

Use the standard logging API available in your environment.

.NET applications must use log4net. Java applications must use the de-facto standard logging API - Apache Commons Logging with log4j beneath it.

Logger Category Name

It is common practice to instantiate a logger using the fully qualified class name as the message category name. This allows the application to limit/fine-tune log messages at a fine-grained level, down to individual classes.

However, there are instances where this approach is not sufficient. For instance, the application may have to raise an alarm if it detects an attempt to violate access permissions. For example, in a banking application, account X trying to read the details of account Y may indicate a serious break-in attempt. The application may have to log such attempts to a separate category named "security" with a high-decibel alarm, meaning this logger category may email/SMS the security team about the incident. Such a requirement has to be analyzed and decided on a per-application basis.
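For illustration, here is a minimal C# sketch using log4net (which the API guideline above mandates for .NET); the AccountService class and its method are hypothetical, and the "security" category would be routed to its own alerting appender in configuration:

using log4net;

public class AccountService   // hypothetical class used for illustration
{
    // Category named after the fully qualified class - normal diagnostics.
    private static readonly ILog Log = LogManager.GetLogger(typeof(AccountService));

    // Separate, explicitly named "security" category that configuration can route
    // to a dedicated appender (for example one that emails/SMSes the security team).
    private static readonly ILog SecurityLog = LogManager.GetLogger("security");

    public void ReadAccountDetails(string requestingAccount, string targetAccount)
    {
        if (requestingAccount != targetAccount)
        {
            SecurityLog.Warn("Account " + requestingAccount +
                " attempted to read details of account " + targetAccount);
            return;
        }
        Log.Debug("Reading details of account " + targetAccount);
    }
}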

Logging Level

These are the commonly available log levels in logging frameworks. This set has been taken from the Apache Commons Logging framework; pretty much any other logging framework's log levels can be mapped to it.

  • FATAL
  • ERROR
  • WARN
  • INFO
  • DEBUG
  • TRACE

Very strict rules apply to the two extremes of log levels. The rest are debatable, and where to draw the lines is an aesthetic decision to be made by the team.

FATAL - Reserved for problems which lead to complete failure of the component/application. For example, a network error on a database connection is FATAL. Wrong user input leading to problems is not fatal - it can be WARN or INFO at most. However, this rule changes slightly for re-usable components, because the component cannot decide whether a problem is FATAL - it is the job of the calling code to decide that. Therefore, re-usable code should avoid logging such problems and indicate them to the outside world through the exception mechanism. If you have to log it, use WARN or INFO level depending on the severity of the situation arising out of the intended usage of the re-usable code.

ERROR - Genuine errors arising within your code should be logged at ERROR level. Problems you catch from the code you depend on are not errors; such problems should be logged as WARN if you can recover from them, or FATAL if you decide to give up. However, this rule changes slightly for re-usable components, because the component cannot decide whether a problem is an ERROR - it is the job of the calling code to decide that. Therefore, re-usable code should avoid logging such problems and indicate them to the outside world through the exception mechanism. If you have to log it, use WARN or INFO level depending on the severity of the problem.

WARN - Situations where something went seriously wrong but the code can recover should be logged at WARN level.

INFO - As the name suggests, information should be logged at INFO level. For example, log configuration data read from configuration files at INFO level, because you want to record what was read from the configuration file when the application started. Another example: log the exceptions you catch in re-usable components at INFO, because that code has insufficient information to decide on any other severity level, yet you want to record that such a thing happened.

DEBUG - Log low-level information concerning code logic decisions, internal state, etc. This should allow a developer to understand the behavior of the code. It is expected to increase the log size. This log level is turned off in production deployment.

TRACE - Lowest log level. This level is used to log the finest details of code flow - including method entry and exit details. This should allow a developer to debug the application code in the absence of an interactive debugger. This log level is expected to generate huge logs. This log level is turned off in production deployment. [See "tracing" later in the document.]

Your Code - Logger Calls


Create a static logger instance, because you don't want to keep one logger instance per class instance. The logger instance should also be final, because you don't want the reference to change after it is created.

import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;

public class MyClass {
    private static final Log LOG = LogFactory.getLog(MyClass.class);
    ...
    public void myMethod() {
        LOG.warn("Failed to fetch profile data.", profileException);
        ...
        LOG.info("Redirecting to input page.");
        ...
    }
}

Exception/throwable

An exception/throwable should be logged in its entirety - including the stack trace. This is achieved by passing the exception/throwable object to the logger call as a parameter so that the logger can log all the details. The logger uses this exception/throwable to log the full stack trace, including the root causes wrapped in the exception, down to the bottom-most cause.

LOG.fatal("Earth has gone out of trajectory.", trajectoryException);

Don't log exceptions as shown below - that is bad practice because you lose the stack trace. It reduces the log entry to mere information without telling exactly what went wrong and where.

/* Bad practice */
LOG.fatal("Earth has gone out of trajectory. " + trajectoryException);

Conditional logging

Log statements must be enclosed in conditionals to avoid the unnecessary runtime overhead of evaluating the log statement. This is preferably done using AOP to wrap all your log statements within conditionals - provided your environment supports AOP. The end effect of wrapping the log statements in a conditional should be the same as the following piece of code. Using AOP reduces clutter in application code and avoids the coding effort involved.

if (LOG.isDebugEnabled()) {
    LOG.debug("User state: isEmployed=" + user.isEmployed() + ", isRegistered=" + user.isRegistered());
}

Fall back to coding the conditionals in application code only when you cannot use AOP because of environment restrictions - for example, the client does not want to use AOP, the app server where you are deploying your application does not support AOP, or your programming environment does not support it.
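For .NET code using log4net (as mandated above), the hand-coded guard might look like the following sketch; the User type and its properties are hypothetical:

using log4net;

public class UserAudit
{
    private static readonly ILog Log = LogManager.GetLogger(typeof(UserAudit));

    public void DumpState(User user)
    {
        // The guard ensures the string concatenation below is never evaluated
        // when the DEBUG level is turned off (e.g. in production).
        if (Log.IsDebugEnabled)
        {
            Log.Debug("User state: isEmployed=" + user.IsEmployed +
                ", isRegistered=" + user.IsRegistered);
        }
    }
}

// Hypothetical type, included only to make the sketch self-contained.
public class User
{
    public bool IsEmployed { get; set; }
    public bool IsRegistered { get; set; }
}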

Log formatting

Don't use any kind of formatting in log statements. Log lines are not for printing purposes - they are only there to help you debug problems, so you don't need any decoration in them. These are all bad practices:

/* All these are bad practices */
LOG.info("######### Executing the action. #############");
LOG.debug("Line 3333333 executed.");
LOG.debug("Employee: --->" + employee.getName() + "\r\n\t" + employee.getId());

Log context

Context data must be included in the logs for them to be useful. In the above example (under conditional logging), just logging user state is useless unless you know which user's state it is - that is the context we need. This context can be a unique piece of information like the user principal or session id. Similarly, the thread id is also needed in the logs to trace the code flow for a specific use-case execution from the user. In clustered environments, the server name or IP address is also needed to identify the server. Context insertion has to be achieved in an unobtrusive way - without major changes to application code wherever logger methods are called.

Example: Java web applications using the log4j implementation can achieve this using a servlet filter that injects context information into the log4j NDC (see http://logging.apache.org/log4j/1.2/manual.html for more information).
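For an ASP.NET application using log4net, a rough analogue of that servlet filter is an HTTP module that stamps per-request context into log4net's ThreadContext. This is only a sketch, the module and property names are assumptions, and the values would be emitted by the appender layout via %property{sessionId} and %property{user}:

using System;
using System.Web;
using log4net;

public class LoggingContextModule : IHttpModule
{
    public void Init(HttpApplication application)
    {
        // Session state is only available from AcquireRequestState onwards.
        application.PostAcquireRequestState += delegate(object sender, EventArgs e)
        {
            HttpContext ctx = ((HttpApplication)sender).Context;
            ThreadContext.Properties["sessionId"] =
                ctx.Session != null ? ctx.Session.SessionID : "-";
            ThreadContext.Properties["user"] =
                ctx.User != null ? ctx.User.Identity.Name : "anonymous";
        };
    }

    public void Dispose() { }
}

The module still has to be registered in web.config; for pipelines that switch threads mid-request, log4net's LogicalThreadContext may be the safer choice.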

Note: Using context logging features like class name, method name and line number may add runtime overhead. Use them with caution, and disable such context logging in the production environment after debugging is over.

Tracing

Fine-grained tracing like method entry/exit should not be hand-coded. Whenever possible, use AOP to insert fine-grained trace logs. This avoids code clutter and makes it easy to change the tracing from one place.

Don't log sensitive data

Sensitive data should not be logged. For example, never log a user's password. There is no magic to avoid logging sensitive data - developers have to be careful, and only careful reviews can catch the problem if it exists.

Clock sync

You must synchronize system clocks in a cluster to get the right sequence of log entries when the logs are merged for analysis. Out-of-sync log entries lead to confusing situations at best and completely failed log analysis at worst.

Dynamic log level threshold changes

Preferably, it should be possible to change the logger's log level threshold dynamically. This is very useful in a production environment, where restarting the app server just to make a new log level threshold take effect for collecting debug information is not possible.
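With log4net, one simple way to approximate this (assuming the configuration lives in an external XML file, here assumed to be named log4net.config) is to watch the file for changes so an edited level threshold takes effect without a restart:

using System.IO;
using log4net.Config;

public static class LoggingBootstrap
{
    public static void Configure()
    {
        // Re-reads the configuration whenever the file changes on disk, so
        // editing a <level value="DEBUG"/> threshold takes effect without
        // restarting the application.
        XmlConfigurator.ConfigureAndWatch(new FileInfo("log4net.config"));
    }
}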

Logging inside loops

Avoid logging inside loops. The major problem is that logging inside a loop can become a considerable overhead, and it becomes more severe if that loop is called from within another loop - you suddenly explode the number of log statements. The only exception to the rule is when you need it badly for debugging purposes. In that case, use DEBUG or TRACE level and enclose the log statements within conditionals so that they have no overhead in production where DEBUG is turned off.

Logging huge objects/lists

Huge objects/lists should never be logged. It will become a major runtime overhead. It will also increase the log file size dramatically.

Log asynchronously

Enable asynchronous logging capabilities of the logger. Otherwise, your code will slow down because of logger calls. Your code may slow down significantly in inherently multi-threaded web applications.

Log rotation

Use a sensible log rotation policy to avoid huge log files. Huge log files create problems with log analysis - it becomes impossible to wade through a multi-GB log file for debugging. This may also lead to disk space problems - you may run out of disk space. Another problem with huge logs is that you may hit the file system's limit on file size, leading to unexpected logger behavior.

Size based log rotation and time based log rotation are commonly used rotation policies.

Size based log rotation: This policy rotates the logs based on size. Something like, rotate logs when the log file reaches 10 MB. Logger creates a new log file whenever log file size reaches 10 MB.

Time based log rotation: This policy rotates the logs based on time. Something like, rotate logs every day. Logger creates a new log file every day.

The general rule is to configure log rotation in such a way that you can open the log file in a tool for analysis. At the same time, log files should not be too small, or your analysis effort gets spread across too many files. For example, log files of 10 kB each may be too small - you may have to analyze thousands of files when tracing a problem over one full day. At the same time, log files bigger than 50 or 100 MB may not be easy to browse using an editor on the Windows platform. Unix environments (including Linux) are a little more forgiving here because it is also possible to grep huge files for specific lines easily. Therefore, on the Windows platform, don't exceed 50 MB per log file; don't exceed 1 GB in Unix environments.

You can do log rotation using logger's built-in facilities like log rotation policy. On Unix environments you can also use system wide log rotation facility (known as logrotate) in combination with logger's log rotation policy.
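As a concrete illustration, here is a hedged sketch of size-based rotation configured programmatically with log4net's RollingFileAppender (the same settings are more commonly expressed in the XML configuration file); the file path and limits are illustrative only:

using log4net.Appender;
using log4net.Config;
using log4net.Layout;

public static class RollingLogSetup
{
    public static void Configure()
    {
        PatternLayout layout = new PatternLayout(
            "%date [%thread] %-5level %logger - %message%newline");
        layout.ActivateOptions();

        // Size-based rotation: start a new file at 10 MB and keep 20 old files.
        RollingFileAppender appender = new RollingFileAppender
        {
            File = @"logs\app.log",
            AppendToFile = true,
            RollingStyle = RollingFileAppender.RollingMode.Size,
            MaximumFileSize = "10MB",
            MaxSizeRollBackups = 20,
            StaticLogFileName = true,
            Layout = layout
        };
        appender.ActivateOptions();

        BasicConfigurator.Configure(appender);
    }
}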

Log cleanup

Use scheduled scripts/scheduled jobs to archive/delete logs older than some duration to avoid running out of disk space. Keep minimum needed log files online - archive/delete older ones. If you decide to archive old logs, compress them to save storage space.

Use the logrotate facility on Unix (and Linux) environments for compressing, archiving and cleaning up old log files. Use a scheduled job on Windows for compressing, archiving and cleaning up old logs.

Important: Both log rotation and log cleanup should be configured on all servers, from the dev environment through to production. Starting with dev and moving to QA, staging and production allows the process to stabilize before reaching production, which dramatically minimizes the problems you may face in production later. For example, knowing the daily log file size and putting an archival/clean-up procedure in place in QA or staging helps you avoid disk space issues in production, because you can move the same procedure to production.

Wednesday, June 11, 2008

11 More Visual Studio Shortcuts You Should Know

http://www.dev102.com/2008/05/06/11-more-visual-studio-shortcuts-you-should-know/

Tuesday, June 10, 2008

Debugging WPF Application

Tools for Debugging

Snoop - A utility to simplify visual debugging of WPF applications at runtime

Mole - A Visual Studio visualizer; works with WPF, WCF, WF, WinForms and ASP.NET projects. Mole not only allows the developer to view objects or data using a customized interface, but also to drill into properties of those objects and then edit them.

Other Resources

Tips for Debugging WPF bindings

General Debugging Tips


Monday, May 26, 2008

A Detailed Look into Customizing Windows Workflow Runtime

http://www.odetocode.com/Articles/457.aspx

Workflow Foundation Basics

Excerpts from here:
1. You only need one instance of the workflow runtime for each process, and you are not allowed to have more than one instance for each AppDomain. The best thing you can do here is to create the required instance directly in the form's constructor. The same runtime object can take care of a variety of workflow instances. The runtime distinguishes instances based on their GUID, and receives private data for each specific instance.


2. There are two general approaches for passing data into a workflow from the host application when it is instantiated: parameters and events. (A minimal hosting sketch showing the parameter approach follows this list.)

3. Two Type of Workflows: Sequential and State Machine - A sequential workflow is a predictable workflow. The execution path might branch, or loop, or wait for an outside event to occur, but in the end, the sequential workflow will use the activities, conditions, and rules we've provided to march inevitably forward. The workflow is in control of the process.
A state-machine workflow is an event driven workflow. That is, the state machine workflow relies on external events to drive the workflow to completion. We define the legal states of the workflow, and the legal transitions between those states. The workflow is always in one of the states, and has to wait for an event to arrive before transitioning to a new state. Generally, the important decisions happen outside of the workflow. The state machine defines a structure to follow, but control belongs to the outside world.
We use a sequential workflow when we can encode most of the decision-making inside the workflow itself. We use a state machine workflow when the decision-making happens outside the workflow. In this chapter, we will take a closer look at how state machines operate.
Read More here.

4. When you create a new Sequential Workflow Console Application project, the Visual Studio 2005 Solution Explorer will contain two files - workflow1.cs and, initially hidden from view, workflow1.designer.cs. These two files represent the workflow being created. A Windows Workflow Foundation workflow consists of the workflow model file and a code file class. The workflow1.cs class is the code file class where you can write your own workflow business logic. The workflow1.designer.cs class represents the description of the activities map. This file is managed automatically by Visual Studio 2005 in much the same way it happens with forms in a Microsoft Windows Forms project. As you add activities to the workflow, Visual Studio 2005 updates the designer class with Microsoft C# code that programmatically builds the map of activities. To continue with the Windows Forms analogy, a workflow is like a form, whereas activities are like controls.
You can choose another form of persistence for the activity layout - the XML workflow markup format. To try this approach, you delete the workflow1.cs file from the project and add a new workflow item, “Sequential Workflow (with code separation)”.
Now your project contains two files - workflow1.xoml and workflow1.xoml.cs. The former contains the XML workflow markup that represents the workflow model; the latter is a code file class, and contains source code and event handlers for the workflow.


5. At run time, changes to the collection of activities are also possible, giving you the ability to make changes to a running instance of a workflow. Workflow changes are motivated by business changes that were not known at design time, or by the need for business logic that modifies and then completes the business process. In any case, it should involve limited changes - perfecting rather than redesigning.

6. A workflow can be exposed as a web service. The Windows Workflow Foundation framework supports Web service interoperability, which includes the ability to expose a workflow as a Web service to ASP.NET clients and to other workflows. Windows Workflow Foundation supports publishing a workflow as an ASP.NET Web service on a Web server or server farm running ASP.NET on Microsoft IIS 6.0. The Windows Workflow Foundation framework activity set contains the WebServiceReceive and WebServiceResponse activities, which enable a workflow to be used as Web service endpoints.

In a real application, it may take weeks or more for a workflow to reach a final/complete state. Fortunately, state machine workflows can take advantage of workflow services, like tracking and persistence (both described in Hosting Windows Workflow). A persistence service could save the state of our workflow and unload the instance from memory, then reload the instance when an event arrives weeks later.
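Tying points 1 and 2 together, here is a minimal WF 3.x console-hosting sketch; the MyWorkflow type and the OrderId parameter are hypothetical stand-ins:

using System;
using System.Collections.Generic;
using System.Threading;
using System.Workflow.Activities;
using System.Workflow.Runtime;

// Trivial stand-in workflow, defined only to make the sketch self-contained.
public class MyWorkflow : SequentialWorkflowActivity
{
    public int OrderId { get; set; }
}

class WorkflowHost
{
    static void Main()
    {
        // One runtime per process/AppDomain; individual workflow instances
        // are distinguished by their GUIDs.
        using (WorkflowRuntime runtime = new WorkflowRuntime())
        {
            AutoResetEvent done = new AutoResetEvent(false);
            runtime.WorkflowCompleted += delegate { done.Set(); };
            runtime.WorkflowTerminated += delegate(object s, WorkflowTerminatedEventArgs e)
            {
                Console.WriteLine(e.Exception.Message);
                done.Set();
            };
            runtime.StartRuntime();

            // Passing data in via parameters: dictionary keys must match
            // public properties on the workflow class.
            Dictionary<string, object> args =
                new Dictionary<string, object> { { "OrderId", 42 } };
            WorkflowInstance instance = runtime.CreateWorkflow(typeof(MyWorkflow), args);
            instance.Start();

            done.WaitOne();
            runtime.StopRuntime();
        }
    }
}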






Friday, May 16, 2008

Passionate about ?

I have always been passionate about three things: Cooking, Singing & Acting, but by profession I am a software engineer and I do enjoy it. I have been in this profession for four years. But unfortunately I had to take a break for a few months because the company I had been working at for two years could not assign work to me any more.
I was so upset because I would earn nothing for a period and had a sense of job insecurity too (I knew finding a new job was not a big deal for me, but I am in the US for just the next two months and for this small period I would not struggle).

Then God showed me a ray of hope. I had been cooking a lot since I moved to the US, so I thought of starting cooking classes, because I had seen many people here who would love Indian and vegetarian food. I planned and posted some fliers at a few places, and some people seemed to be interested. And now I am enjoying welcoming and introducing people to the world of spices, Indian cooking!!! Such a nice break!!!

I never thought that I would really ever get a chance to do something interesting with any one of my passions, but wow, I got a chance to do lots of cooking and now I have been running cooking classes for a while.

What I have learned from this situation is "never get upset and believe in yourself"; this belief can turn the worst thing into the best thing in your life. grin !!

Wednesday, May 14, 2008

Object Oriented design???

As per my understanding, the following points should be kept in mind while designing an OO system:

1. Encapsulate what varies: This makes code maintenance easy, because by encapsulating what varies we move the code most susceptible to change into one place, where it can be modified without affecting the stable code if needed in the future. This way we introduce low coupling between the code which varies and the code which stays the same.

2. Program to an interface, not an implementation: Stable code is code which is closed for modification but open for extension. By programming to an interface or abstraction we can modify the actual implementation at any time without affecting the code that uses the interface. This gives us flexible and extensible code. (See the sketch after this list.)

3. Favor composition over inheritance: It allows us to delegate some of the responsibilities to the composed objects. This way we can change a composed object's implementation independently.

4. Classes should be open for extension but closed for modifications.
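As a small illustration of points 2 and 3 together (all type names here are hypothetical), the caller depends only on an interface, and behaviour is supplied through composition rather than inheritance:

using System;

// Point 2: callers program against this abstraction, not a concrete class.
public interface IPaymentProcessor
{
    void Process(decimal amount);
}

public class CardPaymentProcessor : IPaymentProcessor
{
    public void Process(decimal amount)
    {
        Console.WriteLine("Charging card: " + amount);
    }
}

// Point 3: Order is composed with a processor and delegates to it, so swapping
// implementations (cash, card, a mock for tests) needs no change to Order.
public class Order
{
    private readonly IPaymentProcessor processor;

    public Order(IPaymentProcessor processor)
    {
        this.processor = processor;
    }

    public void Checkout(decimal total)
    {
        processor.Process(total);
    }
}

class Demo
{
    static void Main()
    {
        new Order(new CardPaymentProcessor()).Checkout(49.99m);
    }
}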

There is more to come...

Thursday, May 8, 2008

IQueryable and IQueryProvider

This post by Bart on LINQ to Active Directory gives good insight into how LINQ to anything works, but it was based on the .NET 3.5 Beta 1 release and used the IQueryable interface, because IQueryProvider was only introduced in Beta 2.

Actually, the IQueryable interface was factored into two interfaces in the Beta 2 release:

1. IQueryable
2. IQueryProvider

If you use Visual Studio to ‘Go to definition’ you get something that looks like this:

public interface IQueryable : IEnumerable {
    Type ElementType { get; }
    Expression Expression { get; }
    IQueryProvider Provider { get; }
}

public interface IQueryProvider {
    IQueryable CreateQuery(Expression expression);
    IQueryable<TElement> CreateQuery<TElement>(Expression expression);
    object Execute(Expression expression);
    TResult Execute<TResult>(Expression expression);
}
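To make the split between the two interfaces concrete, here is a tiny LINQ to Objects example (not specific to Bart's provider): the Where call only composes an expression tree and hands it to the Provider, and enumeration is what finally triggers Execute.

using System;
using System.Linq;

class QueryableDemo
{
    static void Main()
    {
        // AsQueryable wraps the array in the built-in LINQ to Objects provider.
        IQueryable<int> numbers = new[] { 1, 2, 3, 4, 5 }.AsQueryable();

        // No filtering happens here; Where calls
        // numbers.Provider.CreateQuery<int>(expression) with the composed tree.
        IQueryable<int> evens = numbers.Where(n => n % 2 == 0);

        Console.WriteLine(evens.Expression);              // the expression tree
        Console.WriteLine(evens.Provider.GetType().Name); // the IQueryProvider implementation

        // Enumerating forces the provider to execute the expression.
        foreach (int n in evens)
            Console.WriteLine(n);
    }
}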

The LINQ to Amazon example shows how this change affects Bart's LINQ to Active Directory implementation.

Good resource: Linq to Everything: A List of LINQ Providers


ASP.Net 2.0 Compilation Model

The ASP.NET compiler (aspnet_compiler.exe) was originally introduced in ASP.NET 2.0 as a way of completely precompiling an entire site, making it possible to deploy nothing but binary assemblies (even .aspx and .ascx files are precompiled).
This is compelling because it eliminates any on-demand compilation when requests are made, eliminating the first postdeployment hit seen in some sites today. It also makes it more difficult for modifications to be made to the deployed site (since you can't just open .aspx files and change things), which can be appealing when deploying applications that you want to be changed only through a standard deployment process.
The compiler that ships with the release version of ASP.NET 2.0 supports this binary-only deployment model, but it has also been enhanced to support an updatable deployment model, where all source code in a site is precompiled into binary assemblies, but all .aspx and .ascx files are left basically intact so that changes can be made on the server (the only changes to the .aspx and .ascx files involve the CodeFile attribute being removed and the Inherits attribute being modified to include the assembly name). This model is possible because of the reintroduction of inheritance in the codebehind model, so that the sibling partial classes containing control declarations can be generated and compiled independently of the actual .aspx file class definitions.


So, there are three deployment modes. VS 2005 deploys a site in updatable mode by default, but using the aspnet_compiler.exe utility one can use the other two options as well, that is, all source and all binary.

All Source Mode: This mode is generally used during the development phase.

All Binary Mode

When the binary deployment option is used:

aspnet_compiler -v /MyWebSite -p "F:\Samples\MyWebSite" "C:\Inetpub\wwwroot\MyWebSite"

It results in:
1) The .aspx files in the deployment directory are just marker files with no content. They have been left there to ensure that a file with the endpoint name is present in case the "Check that file exists" option for the .aspx extension in an IIS app is set.
2) The PrecompiledApp.config file is used to keep track of how the app was deployed and whether ASP.NET needs to compile any files at request time.
3) A precompiled dll which includes all the codebehind files for .aspx (separate assembly per directory), .ascx (separate assembly per directory) and .master files, and the App_Code folder.

Updatable (mixed) Mode

To generate the "updatable" site, you would add a -u to the command line, and the resulting .aspx files would contain their original content (and not be empty marker files). Note that this functionality can also be accessed graphically through the Build | Publish Web Site menu item of Visual Studio 2005. Both the command-line tool and Visual Studio rely on the ClientBuildManager class of the System.Web.Compilation namespace to provide this functionality.

Deploying a website by executing the following command:
aspnet_compiler -v /MyWebSite -p "F:\Samples\MyWebSite" -u "C:\Inetpub\wwwroot\MyWebSite"

results in a precompiled dll which includes all the codebehind files for .aspx (separate assembly per directory), .ascx (separate assembly per directory) and .master files, and the App_Code folder.
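Since both the command-line tool and Visual Studio sit on top of ClientBuildManager, precompilation can also be driven from code; the following is only a sketch, reusing the paths from the commands above:

using System.Web.Compilation;

class Precompile
{
    static void Main()
    {
        // Programmatic equivalent of the aspnet_compiler calls above.
        // For updatable mode, use the constructor overload that takes a
        // ClientBuildManagerParameter with PrecompilationFlags.Updatable.
        ClientBuildManager cbm = new ClientBuildManager(
            "/MyWebSite",                      // virtual path
            @"F:\Samples\MyWebSite",           // physical source directory
            @"C:\Inetpub\wwwroot\MyWebSite");  // target (precompiled) directory

        cbm.PrecompileApplication();
    }
}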


Things to be cautious about
Another thing to keep in mind when considering the file-to-assembly mapping is that the use of the internal keyword to prevent external assemblies from accessing methods in your classes may work in some deployment scenarios and not others, because of the different assembly mapping options. Unless you plan ahead of time which deployment option you will be using, it is probably best to avoid internal methods in your pages and stick to the type-scoped protection keywords: public, protected, and private.

How often do you let other people

I found an interesting read and am posting it here as-is, so I do not lose it.

How often do you let other people's nonsense change your mood? Do you let a bad driver, rude waiter, curt boss, or an insensitive employee ruin your day?

However, the mark of a successful person is how quickly she can get back her focus on what's important.

Sixteen years ago I learned this lesson. I learned it in the backseat of a New York City taxi cab. Here's what happened.

I hopped in a taxi, and we took off for Grand Central Station. We were driving in the right lane when all of a sudden, and I mean without warning, a black car jumped out of a parking space right in front of us. My taxi driver slammed on his brakes, skidded and missed the other car's back end by just inches.

Here's what happened next. The driver of the other car, the guy who almost caused a big accident, whipped his head around and he started yelling bad words at us.

Now, here's what blew me away. My taxi driver just smiled and waved at the guy. And I mean, he was friendly. So, I said, "Why did you just do that? This guy almost ruined your car and sent us to the hospital!" And this is when my taxi driver told me what I now call "The Law of the Garbage Truck."

Many people are like garbage trucks. They run around full of garbage, full of frustration, full of anger, and full of disappointment. As their garbage piles up, they need a place to dump it. And if you let them, they'll dump it on you.

When someone wants to dump on you, don't take it personally. You just smile, wave, wish them well, and move on. You'll be happy you did. I guarantee it.

So this was it: The "Law of the Garbage Truck." I started thinking, how often do I let Garbage Trucks run right over me? And how often do I take their garbage and spread it to other people: at work, at home, on the streets? It was that day I said, "I'm not going to do it anymore."

Well now "I see Garbage Trucks." I see the load they're carrying. I see them coming to drop it off. And like my Taxi Driver, I don't make it a personal thing; I just smile, wave, wish them well, and I move on.

Good leaders know they have to be ready for their next meeting. Good parents know that they have to welcome their children home from school with hugs and kisses. Leaders and parents know that they have to be fully present, and at their best for the people they care about.

The bottom line is that successful people do not let Garbage Trucks take over their day. What about you? What would happen in your life, starting today, if you let more garbage trucks pass you by?

Here's my bet. You'll be happier. I guarantee it.

Trip to Victoria, BC, Canada

Last weekend, we went to Canada for the last time, as our visa expires on 9th May.
We went to Victoria (situated on Vancouver Island), BC, Canada, and that trip was the most exciting and exhilarating we ever had in Canada.

It is such a beautiful city, and the wealthiest city I have ever seen. We left Saturday morning at 4 am, entered Canada at Vancouver (by car) around 7 am, took a ferry, and reached Victoria (on Vancouver Island) around 9 am.
Vancouver Island and Vancouver city are different, but both are in BC, so do not get confused.
Then we went to the miniature world, and after spending 2.5 hours there we walked to the wax museum and then back to the hotel. We were all very tired by that time but did not sleep. Rather, we ate something and then just walked around to get a look and feel of the city. While returning to the hotel, Anjal (my kiddo) fell asleep in my arms; we also went to bed just after doing a little planning for the next day.

Sunday, we got up early and left the hotel around 6:45 am just to see the beaches; all of them were on the same drive, called Beach Drive. Our eyes captured the real beauty and wealth of the city (houses, very big and beautiful, and the whole infrastructure was so good) while driving around and reaching various beaches and several viewpoints in between.

Then we returned to the hotel, and after getting refreshed we checked out and headed for our next spot, "The Butchart Gardens"; it was very huge, full of myriads of flowers. We reached there at 12:30 pm and left around 4 pm to take the 5 pm return ferry to Vancouver (BC, Canada).
We drove from Vancouver to Anacortes (where we live) for about 2.5 hours and then we were at our sweet home. Fully tired but satisfied with the weekend. :)


Thursday, May 1, 2008

What matters for you? Money in pocket OR Brain in mind!!!

Software engineers earn good money. And I have seen many people who want very quick promotions. But I don't want to be a manager soon. I want to spend more time as a developer. I want to learn and understand so many things, because I think managers do not get time for learning and have busy schedules; sometimes I wonder how they get time for their personal life and family. I have seen people who have calls until late at night. Life is not just about spending weekends with family, is it? On weekdays, working hours should be normal for anyone, not more than 9-10 hours.

Today, I was reading an article and I liked one line (the title of this article) very much and thought about it for a while. I realised that it says a lot.

I think if we are running madly after promotion (indirectly after money; I do not mean that promotions are bad, but they should come through the usual ladder, and it varies from person to person), we are probably gaining nothing but compromising our peaceful life and losing valuable moments just to get early promotions. And that too for something which will not stay long; we will spend it somewhere pretty soon.

I would prefer doing things using my brain (no hurry) and working for myself, not just for the purpose of completing an assigned task and pleasing my manager. While doing my job I want to learn as much as possible and know the internals of technologies (because it gives me satisfaction); it does not matter if it earns me less money.
And I have experienced that the willingness to gain detailed knowledge gives ultimate satisfaction, and that is what we want from our lives.

I know some readers might not agree with what I said, but this is what I think, and we are all free to write what we think. :)

My choice is brain in mind over money!!!!!

Thursday, April 24, 2008

LINQ to SQL vs Entity Framework and LINQ to Entities

Here are simple and clear differences between LINQ to SQL, the Entity Framework and LINQ to Entities by Michael Pizzo; posting them here so that I do not lose them.

LINQ to SQL and the Entity Framework have a lot in common, but each have features targeting different scenarios in the Orcas timeframe.

LINQ to SQL has features targeting "Rapid Development" against a Microsoft SQL Server database. Think of LINQ to SQL as allowing you to have a strongly-typed view of your existing database schema. LINQ to SQL supports a direct, 1:1 mapping of your existing database schema to classes; a single table can be mapped to a single inheritance hierarchy (i.e., a table can contain persons, customers, and employees) and foreign keys can be exposed as strongly-typed relationships. You can build LINQ queries over tables/views/table valued functions and return results as strongly typed objects, and call stored procedures that return strongly typed results through strongly typed methods. A key design principle of LINQ to SQL is that it "just work" for the common cases; so, for example, if you access a collection of orders through the Orders property of a customer, and that customer's orders have not previously been retrieved, LINQ to SQL will automatically get them for you. LINQ to SQL relies on convention, for example default insert, update, and delete logic through generated DML can be overwritten by exposing appropriately named methods (for example, "InsertCustomer", "UpdateCustomer", "DeleteCustomer"). These methods may invoke stored procedures or perform other logic in order to process changes.

The Entity Framework has features targeting "Enterprise Scenarios". In an enterprise, the database is typically controlled by a DBA, the schema is generally optimized for storage considerations (performance, consistency, partitioning) rather than exposing a good application model, and may change over time as usage data and usage patterns evolve. With this in mind, the Entity Framework is designed around exposing an application-oriented data model that is loosely coupled, and may differ significantly, from your existing database schema. For example, you can map a single class (or "entity") to multiple tables/views, or map multiple classes to the same table/view. You can map an inheritance hierarchy to a single table/view (as in LINQ to SQL) or to multiple tables/views (for example, persons, customers, and employees could each be separate tables, where customers and employees contain only the additional columns not present in persons, or repeat the columns from the persons table). You can group properties into complex (or “composite”) types (for example, a Customer type may have an “Address” property that is an Address type with Street, City, Region, Country and Postal code properties). The Entity Framework lets you optionally represent many:many relationships directly, without representing the join table as an entity in your data model, and has a new feature called "Defining Query" that lets you expose any native query against the store as a "table" that can be mapped just as any other table (except that updates must be performed through stored procedures). This flexible mapping, including the option to use stored procedures to process changes, is specified declaratively in order to account for the schema of the database evolving over time without having to recompile the application.

The Entity Framework includes LINQ to Entities which exposes many of the same features as LINQ to SQL over your conceptual application data model; you can build queries in LINQ (or in “Entity SQL”, a canonical version of SQL extended to support concepts like strong typing, polymorphism, relationship navigation and complex types), return results as strongly typed CLR objects, execute stored procedures or table valued functions through strongly-typed methods, and process changes by calling a single save method.

However, the Entity Framework is more than LINQ to Entities; it includes a "storage layer" that lets you use the same conceptual application model through low-level ADO.NET Data Provider interfaces using Entity SQL, and efficiently stream results as possibly hierarchical/polymorphic DataReaders, saving the overhead of materializing objects for read-only scenarios where there is no additional business logic.
The Entity Framework works with Microsoft SQL Server and 3rd party databases through extended ADO.NET Data Providers, providing a common query language against different relational databases through either LINQ to Entities or Entity SQL.

So while there is a lot of overlap, LINQ to SQL is targeted more toward rapidly developing applications against your existing Microsoft SQL Server schema, while the Entity Framework provides object- and storage-layer access to Microsoft SQL Server and 3rd party databases through a loosely coupled, flexible mapping to existing relational schema.

Conclusion by me:

1. The big difference between LINQ to SQL and the Entity Framework is that the EDM is more flexible and more loosely coupled because of the 3-tiered model (conceptual layer, source schema and the mapping layer in between). For example, you could have one conceptual layer and then multiple mapping layers that point to multiple databases. In LINQ to SQL, your dbml properties are tightly bound directly to one particular field in a table.

2. You can query entities using either LINQ or Entity SQL (this can be through either the object services API or the EntityClient, the one which gives you connections/commands and results in a DbDataReader).

Entity SQL, while not as elegant as using strongly typed LINQ, has the advantage of enabling dynamic queries, since you use a string to build a query, much like T-SQL.
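A small sketch of the two query styles side by side; NorthwindEntities, Customer, City and CompanyName are assumed to be types and properties generated from an EDMX model, so this is illustrative rather than complete:

using System;
using System.Data.Objects;   // Entity Framework v1 (.NET 3.5 SP1)
using System.Linq;

class QueryStyles
{
    static void Main()
    {
        using (NorthwindEntities ctx = new NorthwindEntities())
        {
            // LINQ to Entities: strongly typed and checked at compile time.
            var londonCustomers = from c in ctx.Customers
                                  where c.City == "London"
                                  select c;
            Console.WriteLine(londonCustomers.Count());

            // Entity SQL: built from a string, so it can be composed dynamically at runtime.
            ObjectQuery<Customer> sameQuery = ctx.CreateQuery<Customer>(
                "SELECT VALUE c FROM NorthwindEntities.Customers AS c WHERE c.City = @city",
                new ObjectParameter("city", "London"));

            foreach (Customer c in sameQuery)
                Console.WriteLine(c.CompanyName);
        }
    }
}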

Most Precious Gift : To be with my Angel !!!!

For almost seven months, I have been living in the city of Anacortes, situated on Fidalgo Island in Washington State. This is the first time I have got to live in a country other than my home country, India.
It is a more beautiful city than I had ever imagined; especially during spring it is full of flowers, along with the everlasting scenic beauty of sea shores, mountains and trees.
I just walk out of my apartment and look around; it gives me immense pleasure to be here.

This has been an amazing experience of my life. I have realized a few things which seem to be very important in one's life; if not to others, at least to me they are.

The most precious gift was to have plenty of time to spend with my toddler boy.
I loved every moment here with him; I actually got to see him growing and learning new things.
My husband and I have been enjoying finding new games for him every day, and getting to feel like a child again. Sometimes I think there are so many crazy things which parents and child can both enjoy.

One day, my husband just picked up a balloon and started drawing something; my kiddo liked it so much that he would always ask him or me to draw all the animals (he knew) one after another.
Just to add, he just turned two this April but knows so many animals, almost all the colors, basic shapes, counting from 1 to 10 and the A-Z alphabet, a few words and a few sentences. huh!!!!
He is bilingual and speaks Hindi and English (his pronunciation is not very good though). I am sure, when I was two, I hardly knew half of what he knows.
But yes, this balloon drawing lasted very long, almost a month, and he loved the lion and the elephant the most.

He loves reading books. He has almost 20 books; I never had that many, smile... He spends most of his time with books at his day care. He also plays with blocks and goes out to play with a ball.

Spring has just started and we go out in the evening, mostly to the outside play area of the day care my little boy goes to. There are so many public play areas for children here, so he has many choices. He enjoys the swings and slides there a lot.
Very recently, we discovered an interesting exercise for him when we were looking at the sky. The clouds were so nice we just kept watching for a few minutes and started to imagine different shapes in them. When we showed the same to our son, he was so delighted to see, by his imagination, all the animals he has always loved.

I would say that in India it was impossible, because life was so busy. In fact, I never spent much time with my family (at least on weekdays). Most of the time goes to the workplace and traveling to work, just because of the traffic on the roads.

Here, the usual work time is 8 am to 5 pm (a nice early-to-work and early-to-home tradition) and after that all the time is yours. Play, cook, eat and enjoy!!!!!

Friday, April 11, 2008

ASP.Net Dynamic Data Support

What ASP.Net Dynamic Data support does?


  • Automatically renders fully functional editable pages that are dynamically constructed from your ORM data model ('LINQ to SQL' or 'LINQ to Entities') metadata.

  • Provides automatic UI validation support based on the constraints your data model classes pick up from the database, unless you modify them. These data constraints can be modified by extending a model class (implementing its partial class); there is no need to put validation checks in the aspx code.
The following snippet validates the product name, renames the UnitsInStock column and defines a range for it:

[MetadataType(typeof(Product_MD))]
public partial class Product
{
    partial void OnProductNameChanging(string value)
    {
        if (value[0] == 'a')
            throw new Exception("Must not start with a");
    }
}

public class Product_MD {
    [DisplayName("My units")]
    [Description("This number shows how many units are left")]
    [Range(0, 150, ErrorMessage = "Incorrect range for the units in stock")]
    public object UnitsInStock { get; set; }
}

Product_MD is an extra class; the reason it is here is that there is no way to add attributes to properties defined on the other side of the partial class (which is the generated model file).

  • Provides support for modifying field/page templates and also for integrating third-party templates (both field and page level) very easily.
The following snippet shows how to apply the 'DbImage' field template to the Picture column of the Category table:
[MetadataType(typeof(CategoryMetadata))]
public partial class Category
{
}

public class CategoryMetadata {
    [UIHint("DbImage")]
    public object Picture { get; set; }
}

Here Scott Hunter talks about adding image support using the Dynamic Data field template support.

  • Watch this video presented at MIX 08 for a detailed introduction to the Dynamic Data support feature.
  • Also, David Ebbo's blog mentions the changes between the Dynamic Data December Preview and the April Preview.


Silverlight 2.0

Good Resources:

Other Resources:

Silverlight.net Forums
Microsoft’s Silverlight.net Community
Microsoft’s Silverlight product site
Scott Guthrie’s Silverlight tutorials and links

ASP.Net 3.5 MVC Framework

As the name suggests, the MVC (Model View Controller) framework gives you the ability to separate data and data-binding logic from the actual presentation of data quite easily, but the biggest win is that you can test the controller entirely independently of the view, so you can perform real unit tests that might be a bit harder to perform on web forms.
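A hedged sketch of what that looks like, assuming the ASP.NET MVC 1.0-style API (System.Web.Mvc); HomeController and the assertions are illustrative only:

using System.Web.Mvc;

// Hypothetical controller: returns a view along with some model data.
public class HomeController : Controller
{
    public ActionResult Index()
    {
        ViewData["Message"] = "Hello from the controller";
        return View("Index");
    }
}

// The controller is exercised without a web server or view engine,
// which is what makes real unit tests possible.
public class HomeControllerTests
{
    public static void IndexReturnsIndexViewWithMessage()
    {
        HomeController controller = new HomeController();

        ViewResult result = (ViewResult)controller.Index();

        System.Diagnostics.Debug.Assert(result.ViewName == "Index");
        System.Diagnostics.Debug.Assert(
            (string)result.ViewData["Message"] == "Hello from the controller");
    }
}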

Good Resources:

Wednesday, April 9, 2008

Common Google Search Keywords

There is an article on CodeProject which discusses common search patterns that we might be using regularly with the Google search engine. Here is the list of patterns:

1. Convert Currency

Suggested Search Keywords: convert 100 USD in indian rupees

2. Mathematical Calculations
Suggested Search Keywords: 1 ft in cm

3. Searching in a Specific Domain
Suggested Search Keywords:
GetCurrentThread + site:*.microsoft.com




JQuery and ASP.Net

These tutorials cover the fundamentals of the jQuery library

Using JQuery for AJAX in ASP.NET

There is a nice article on CodeProject that helps you get started using jQuery in your ASP.Net application; though it is ASP.Net 1.1-based, it can be upgraded to ASP.Net 2.0 easily using VS2005.

Tuesday, April 8, 2008

Evaluation of JavaScript Libraries

Here is a very good comparison/evaluation of 5 different JavaScript libraries.

ASP.Net 2.0 Security Videos

I found a few informative but short videos on ASP.Net 2.0 security by Keith Brown. They are worth watching.


ASP.Net Books

  • Essential ASP.NET by Fritz Onion: Focuses more on the underlying architecture. It has two flavors for a C# developer.
    • Essential ASP.NET with Examples in C# (ASP.NET 1.1 but still useful)
    • Essential ASP.NET 2.0
  • ASP.NET 2.0 MVP Hacks: Describes in detail how tricks might become creative solutions.
  • Programming ASP.NET by Jesse Liberty and Dan Hurwitz
  • Microsoft ASP.NET Coding Strategies with the Microsoft ASP.NET Team by Gibbs and Howard: Not really an introductory book.
  • Programming Microsoft ASP.NET 2.0 Applications: Advanced Topics by Dino Esposito. As the title suggests, it’s not an introductory book. This is much more focused on topics such as building handlers and controls etc.