Microsoft Visual C++ MVP Award for 2015

Hi community,

It gives me great pleasure to announce that Microsoft has awarded me MVP recognition once again, in the Visual C++ category. This is my tenth consecutive year as an MVP award recipient. Wow!!! Who would have imagined it, an MVP award recipient for a decade :-) I am thrilled and excited about this, because it is hard to be awarded the first time and even harder to keep being awarded in the following years. Before I continue writing this blog post, I would like to thank God and his son Jesus in the first place; Microsoft, for valuing and appreciating my time, effort and contributions; the developer community, because you are the reason the MVP program exists, and by helping you I help myself, hence the insatiable appetite for sharing knowledge and ideas is always there; and last but not least, my family, my wife and two daughters: thanks for your understanding and patience with me whenever I can’t go out because I have to finish writing or presenting something… Thank-you all!!!


On this occasion, I will not say or mention how awesome or powerful C++ is, because you must know that already. What I would love to do, though, is reflect on the past 10 years as an MVP, how the program has helped lots of people (including me), and what lies ahead for native code and C++.

The first time I ever gave a talk was back in 2002, for the Visual FoxPro community in Venezuela; then I started presenting for the MUG communities across Venezuela and South America. To me, communities represented my playground: a place where I could talk in “Geek” language, share experiences with peers and learn. At the same time, it helped me get rid of my introvert self… Yes, I was an introvert, but the need to present helped me overcome it and I became an extrovert.

I was awarded for the first time back in January 2006 and I have been an MVP recipient ever since. I came to Australia in February 2008 because a consultancy firm based in Sydney CBD found my blog, and they were keen enough to sponsor me. Seven years later, I can proudly say that I’m Venezuelan-Australian and my children are Australian. What made me special enough back then for this Australian company to sponsor me? My status as an MVP.

To me, being an MVP is more important than any other recognition, because it is about my contributions to society as an individual; a good example of this is how my inputs and comments are taken into consideration by the Evolution Working Group of the ISO C++ committee. Some people ask me whether I get paid by Microsoft, or why I haven’t joined Microsoft, and similar questions. My answer is pretty much always the same… I’d rather be an independent vendor and/or consultant who enjoys doing research, learning and sharing my experiences. It is more fun, trust me.

I used to spend a lot of time in forums and newsgroups (long before Stack Overflow even existed), when NNTP was the only channel available to seek and provide help. In recent years, I have been more of a “blogging” dude than a forum regular; like everything in life, it was a stage I had to go through, learn from and move on. I find blogging more interesting… Heaps more.

I must have been 16 when Microsoft released Microsoft C/C++ 7.0 (which introduced MFC 1.0), and I read every book available about C and C++. In those days, I also experienced a bit of assembler. Even though it was a great learning experience, it could have been much better if someone like an MVP had been available, but the MVP program was only introduced a year later, in 1993, and CompuServe was not available in Venezuela either.

Twenty-something years have passed, and I am still as passionate about C++ and programming in general as I was back in 1992 (to those wondering about my age, I am 38 years old as I type this – a 1976 model – and I have dedicated my life to computers since age 12).

Moving on… What lies ahead for native code and C++? Interesting question, and I will try to be unbiased on this one. C++ has been evolving since its conception, and not only has the language become better and better, but so have the tools. Native code has not gone anywhere; it has always been there (whatever OS your PC, Mac or mobile device is running, it runs natively). Managed languages are great for RAD and common development tasks, but if you want full access to the hardware and bare metal, and to be in control of your program and how it behaves, then native code and C++ is the way to go.

C and C++ were among the first portable languages (long before any VM, CLR or similar) and they will always be in demand. Mobility is considered by many to be the new computing wave, and as a developer, if you want to make the most of your device’s battery, C++ is the right choice, which, by the way, Visual Studio 2015 will support for cross-platform development. So as you can all see, C++ and native code are here to stay.

Once again, thank you all, and I look forward to contributing to the community this year, 2015.

Angel

Handy PowerShell script to crawl file shares and collect information on their structure

 

Hi Community,

I’ve been a bit busy lately (work and family related, with the arrival of my second child), but it’s always great to share information with you. A couple of weeks ago, I was working on an engagement to decide whether to move to the cloud or not (it was a no-brainer for me, taking into consideration my customer’s existing technology stack, but more importantly I needed to collect evidence and facts in order to help them make a decision).

My customer has quite a few file shares spread across Windows servers and multiple NAS devices, so I had to write a PowerShell script (see below) for this task. Personally, I don’t like scripting languages, mainly because I expect my code to be compiled and linked, besides being able to tweak my code to perform the way I want it to, but like I said, that’s my personal opinion.

I hope you’ll find it useful.

param([String]$targetServer = "Win2012R2")

CLS

$shareCount = 0
$startAnalysis = (Get-Date)
$loggedOnUser = (Get-WmiObject -Class Win32_ComputerSystem).UserName

Write-Host "**************************************************************************"
Write-Host "**        File Share Report - Shares on: $targetServer"
Write-Host "**        Currently logged on as: $loggedOnUser"
Write-Host "**        Analysis start: $startAnalysis"
Write-Host "**************************************************************************"

If (-Not [string]::IsNullOrEmpty($targetServer)) {
    $existingShares = Get-WmiObject -Class Win32_Share -ComputerName $targetServer `
        -Filter "Type=0 And Name Like '%[^$]' And Name <> 'NETLOGON' And Name <> 'SYSVOL'"

    foreach ($share in $existingShares) {
        $results = $null
        $uncPath = "\\$targetServer\$($share.Name)"

        # Display summary information for each file share on the target server
        $stats = dir $uncPath -Recurse -ErrorAction "SilentlyContinue" |
            Where-Object { -Not $_.PSIsContainer } | Measure-Object -Property Length -Sum
        $results = New-Object -TypeName PSObject -Property @{
            ComputerName = $targetServer
            LocalPath    = $share.Path
            UNCPath      = $uncPath
            SizeKB       = [math]::Round(($stats.Sum / 1KB), 2)
            NumberFiles  = $stats.Count
        }

        $results | Format-Table ComputerName, LocalPath, UNCPath, SizeKB, NumberFiles

        # Display a breakdown of file types and their sizes
        Get-ChildItem -Path $uncPath -Recurse |
            Where-Object { !$_.PSIsContainer } |
            Group-Object Extension |
            Select-Object @{n="Extension";e={$_.Name -replace '^\.'}},
                @{n="Size(MB)";e={[math]::Round((($_.Group | Measure-Object Length -Sum).Sum / 1MB), 2)}},
                @{n="Average Size(MB)";e={[math]::Round((($_.Group | Measure-Object Length -Average).Average / 1MB), 2)}},
                @{n="Maximum Size(MB)";e={[math]::Round((($_.Group | Measure-Object Length -Maximum).Maximum / 1MB), 2)}},
                Count | Format-Table

        # Display a breakdown of the files in each share by age bucket
        $files = dir $uncPath -Recurse |
            Select-Object FullName, CreationTime, LastWriteTime, Length,
                @{Name="Age";Expression={(Get-Date) - $_.LastWriteTime}},
                @{Name="Days";Expression={[int]((Get-Date) - $_.LastWriteTime).TotalDays}}

        $summary = @{
            Path      = $uncPath
            OverAYear = ($files | Where-Object {$_.Days -gt 365} | Measure-Object).Count
            _365Days  = ($files | Where-Object {$_.Days -gt 180 -and $_.Days -le 365} | Measure-Object).Count
            _180Days  = ($files | Where-Object {$_.Days -gt 90 -and $_.Days -le 180} | Measure-Object).Count
            _90Days   = ($files | Where-Object {$_.Days -gt 30 -and $_.Days -le 90} | Measure-Object).Count
            _30Days   = ($files | Where-Object {$_.Days -gt 7 -and $_.Days -le 30} | Measure-Object).Count
            _7Days    = ($files | Where-Object {$_.Days -gt 0 -and $_.Days -le 7} | Measure-Object).Count
        }

        $ageBucket = New-Object -TypeName PSObject -Property $summary |
            Select-Object Path, OverAYear, _365Days, _180Days, _90Days, _30Days, _7Days

        [Console]::Write("`n*****************************`nFiles grouped by age (Count)`n*****************************`nPath: {0}`nOver a year: {1}`n365 days: {2}`n", $ageBucket.Path, $ageBucket.OverAYear, $ageBucket._365Days)
        [Console]::Write("180 days: {0}`n90 days: {1}`n30 days: {2}`n", $ageBucket._180Days, $ageBucket._90Days, $ageBucket._30Days)
        [Console]::Write("7 days: {0}`n", $ageBucket._7Days)

        $shareCount += 1

        Write-Host "------------------------------------------------------------------------------"
    }

    $endAnalysis = (Get-Date) - $startAnalysis
    Write-Host "Total Shares: $shareCount"
    Write-Host "Analysis Execution Time: $endAnalysis"
} Else {
    Write-Host "**  targetServer needs to be specified.  **" -ForegroundColor Red
}

 

.NET Crash Dump and live process analysis via clrmd

Application debugging and analysis can be a daunting task, even more so when neither source code nor symbols are available. Visual Studio provides developers with powerful debugging capabilities, but the problem developers often face is that Visual Studio is not installed on the target computer, which is fair enough if it is a production environment.

There are a few tools available, with WinDbg being the most powerful and one of my favorites. WinDbg allows developers to debug native code (in kernel and user mode) and managed code through SOS (a debugging extension). The options available are more powerful than the ones provided by the Visual Studio debugger; however, it might not be very user friendly, for the following reasons:

  • You must load SOS via clr or mscorwks (depending on the version of the framework)
  • You must type commands into WinDbg to perform the analysis. These commands are powerful if the developer knows them and has a good understanding of how the CLR works; the only thing is that many of these commands’ names are not very user friendly, as shown below
.load C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\sos.dll


!dumpheap -type MyBusinessObject

PDB symbol for mscorwks.dll not loaded

 Address       MT     Size

027437e4 01d6683c       12

02743830 01d6683c       12

0274387c 01d6683c       12

...

02747d6c 01d6683c       12

02747db8 01d6683c       12

02747e04 01d6683c       12

02747e50 01d6683c       12

02747e9c 01d6683c       12

02747ee8 01d6683c       12

total 30 objects

Statistics:

      MT    Count    TotalSize Class Name

01d6683c       30          360 FinalizerProblem.MyBusinessObject

Total 30 objects


!gcroot 02747d6c

Note: Roots found on stacks may be false positives. Run "!help gcroot" for

more info.

Error during command: warning! Extension is using a feature which Visual Studio does not implement.


Scan Thread 7092 OSTHread 1bb4

Scan Thread 6864 OSTHread 1ad0

Finalizer queue:Root:02747d6c(FinalizerProblem.MyBusinessObject)


!finalizequeue

SyncBlocks to be cleaned up: 0

MTA Interfaces to be released: 0

STA Interfaces to be released: 0

----------------------------------

generation 0 has 0 finalizable objects (002906d0->002906d0)

generation 1 has 36 finalizable objects (00290640->002906d0)

generation 2 has 0 finalizable objects (00290640->00290640)

Ready for finalization 69 objects (002906d0->002907e4)

Statistics:

      MT    Count    TotalSize Class Name

7b47f8f8        1           20 System.Windows.Forms.ApplicationContext

...

7910b694       10          160 System.WeakReference

7b47ff4c        4          224 System.Windows.Forms.Control+ControlNativeWindow

01d6683c       22          264 FinalizerProblem.MyBusinessObject

01d65a54        1          332 FinalizerProblem.Form1

7b4827e8        2          336 System.Windows.Forms.Button

7ae78e7c        8          352 System.Drawing.BufferedGraphics

...

Total 105 objects

There is a great article on MSDN on this subject – Debugging Managed Code using the Windows Debugger.

So, WinDbg is very powerful, but some developers might not find it user friendly. What options do we have then? Well, the good news is that Microsoft has produced and released a library to diagnose and analyze CLR applications, called ClrMD (CLR Memory Diagnostics). It is currently in beta and available to download from NuGet.


Image 1 – Install nuget package

Therefore, I have built a utility to showcase some of the features in the library. The utility is a WPF C# application which uses the ClrMD library and implements the MVVM pattern. The whole idea is to make developers’ lives easier, by providing an easy-to-use UI and encapsulating some of the SOS commands as operations that can be selected on the user interface.


Image 2 – Options available in the utility

As of now, the utility provides only three operations:

  • Dump Heap
  • Heap Stats
  • Threads and StackTrace

These options can be expanded by changing the DebuggerOption ViewModel to add a new option, and by implementing the required code in the CorDbg.Operations class (the code below collects thread and stack trace information from the attached process).

public ObservableCollection<Thread> GetThreadsAndStackTrace() {
    var retval = new ObservableCollection<Thread>();

    if (Session != null) {
        Session.Runtime.Threads.ToList().ForEach(x => {
            var newItem = new Thread() {
                ThreadId = string.Format("{0:X}", x.OSThreadId).Trim(),
                ThreadExecutionBlock = string.Format("{0:X}", x.Teb).Trim()
            };

            x.StackTrace.ToList().ForEach(z => newItem.StackTrace.Add(new StackTrace() {
                InstructionPtr = string.Format("{0,12:X}", z.InstructionPointer),
                StackPtr = string.Format("{0,12:X}", z.StackPointer),
                Method = (z.Method != null ? z.Method.GetFullSignature() : string.Empty)
            }));

            retval.Add(newItem);
        });
    }

    return retval;
}

The operation workflow is as follows:

  • Refresh the process list, if required (needed if the target application was launched after the utility started).
  • Select the mode to attach to the target process.
  • Select an operation and click on the Go button… That simple!

In this example, the target application was another instance of Visual Studio.


Image 3 – Managed Heap Stats


Image 4 – Running threads and their stack traces

I hope you’ll find this utility useful, please feel free to download it and extend it.
[office src="https://onedrive.live.com/embed?cid=2FE1291768841ACE&resid=2FE1291768841ACE!5826&authkey=!ADRH2VtXOaWWuRU" width="98" height="120"]

NDepend 5.3

I have always considered software development an art and a science at the same time, in which developers translate business requirements into computer instructions; creativity, innovation and technology are amalgamated into one. But the road from an idea’s inception to design and then development is long and, amazingly, not perfect, hence we developers sometimes rush things to meet deadlines, or just build the functionality with the “promise” of refactoring and/or improving it later.

As a software developer and architect, I must conduct code reviews of new and existing codebases. One of the challenges I have always faced is the need to query a codebase in order to find dependencies and ensure that the code and/or changes being produced will not break any OOP principles nor introduce dependency issues.

There are a few tools out there that I use to check source code quality, but none of them is as powerful or flexible as NDepend. I have been an NDepend user for a long time now, and I even posted an article about it a few years back. Besides having a refreshed user interface, it now supports CQLinq as well as CQL (Code Query Language), the latter kept for legacy and compatibility reasons.

Once we start NDepend, the start page is displayed. From it, we can start analyzing our code, as well as install add-ins for Reflector and Visual Studio.


NDepend has integrated smoothly into Visual Studio since version 2008, making life easier for developers.


There are many features in the product, but just to mention a few:

  • Multi VS solutions wide-analysis and collaboration
  • Rich Code Search in VS
  • Multi Query Edition in VS
  • Reflector disassembly’s comparison
  • Continuous comparison with a base line in VS
  • Code visualization in VS
    - Dependency Matrix
    - Dependency Graph and Metric View


The Dashboard allows developers to have a quick look at how their codebase is structured, and, more importantly, at the rule violations incurred in the code. Information is displayed in a succinct and clear way, not to mention the charts, which make our code easier to interpret.


I have briefly described some of the features and benefits of NDepend. Before I forget to mention it, the product is available to download as a 14-day trial, so you can use it and get a better idea of its capabilities.

Regards,

Angel

Read XML config files with Visual C++

Hi Community,

I am currently working on an add-in for Visual Studio to expose some functionality available in the Debugger Engine and Extension APIs. The development is in its very early stages, but I have already completed one feature required by the Add-in: the ability to read XML configuration files, similarly to how .NET and the CLR do it, but in this case using Visual C++. The Add-in comprises native and managed code, where .NET helps me interact with the IDE, and the core functionality of the Add-in is encapsulated in a DLL. This blog entry describes the ConfigReader class.

A sample configuration file is shown below

<?xml version="1.0" encoding="utf-8"?>
<config>
    <sendOutputToVSWindow enabled="true" />
    <extensions>
        <extension name="ext" path="C:\Program Files (x86)\Windows Kits\8.1\Debuggers\x86\winext\ext.dll" />
        <extension name="wow64exts" path="C:\Program Files (x86)\Windows Kits\8.1\Debuggers\x86\WINXP\wow64exts.dll" />
        <extension name="exts" path="C:\Program Files (x86)\Windows Kits\8.1\Debuggers\x86\WINXP\exts.dll" />
        <extension name="uext" path="C:\Program Files (x86)\Windows Kits\8.1\Debuggers\x86\winext\uext.dll" />
        <extension name="ntsdexts" path="C:\Program Files (x86)\Windows Kits\8.1\Debuggers\x86\WINXP\ntsdexts.dll" />
    </extensions>
</config>

The configuration file must reside in the same folder as the library, and it is located from the constructor of the class.

ConfigReader::ConfigReader() {
    ReadConfig();
}

void ConfigReader::ReadConfig() {
    if (!LocateConfigFile())
        throw std::exception("Config file not found. Unable to proceed.");
}

Since the DLL can be anywhere (unlike a .NET assembly, which is either in the GAC or in the application’s bin folder), I needed to find the configuration file in the same path as the library, and this is done by enumerating the loaded modules.

BOOL ConfigReader::LocateConfigFile() {
    auto retval = FALSE;
    DWORD nModuleCount = 0;
    IXMLDOMDocumentPtr pDocPtr;
    MODULEINFO moduleDetails = {0};
    HANDLE hToken = NULL, hProcess = NULL;
    HMODULE hLoadedModules[Max_Loaded_Modules];

    CoInitialize(NULL);

    if ((hToken = GetThreadToken()) != NULL && SetPrivilege(hToken, SE_DEBUG_NAME, TRUE)) {
        if ((hProcess = OpenProcess(PROCESS_ALL_ACCESS, TRUE, GetCurrentProcessId())) != NULL) {
            if ((EnumProcessModulesEx(hProcess, hLoadedModules, sizeof(hLoadedModules), &nModuleCount, LIST_MODULES_ALL)) != NULL) {
                auto modules = std::vector<HMODULE>(std::begin(hLoadedModules), std::end(hLoadedModules));

#ifdef _WIN64
                nModuleCount = Item_Count(nModuleCount) / 2;
#else
                nModuleCount = Item_Count(nModuleCount);
#endif

                auto config = DoesConfigFileExist(hProcess, modules, TargetImageName);

                if (!config.empty())
                    retval = ParseConfigFile(config);
            }
            CloseHandle(hProcess);
        }
        SetPrivilege(hToken, SE_DEBUG_NAME, FALSE);
        CloseHandle(hToken);

        CoUninitialize();
    }

    return retval;
}

 

std::wstring ConfigReader::DoesConfigFileExist(const HANDLE& hProcess, std::vector<HMODULE>& hModules, const wchar_t* targetImage) {
    BOOL found = FALSE;
    std::wstring retval;
    wchar_t szDir[_MAX_DIR];
    wchar_t szExt[_MAX_EXT];
    wchar_t szBuffer[MAX_PATH];
    wchar_t szFName[_MAX_FNAME];
    wchar_t szDrive[_MAX_DRIVE];

    std::find_if(hModules.begin(), hModules.end(), [&, this](HMODULE hModule) {
        auto ret = FALSE;

        if (!found && hModule != nullptr && (GetModuleFileNameEx(hProcess, hModule, szBuffer, Array_Size(szBuffer))) != NULL) {
            size_t cntConverted;
            char szAnsiPath[MAX_PATH];
            _wsplitpath_s(szBuffer, szDrive, Array_Size(szDrive), szDir, Array_Size(szDir), szFName, Array_Size(szFName), szExt, Array_Size(szExt));
            auto imageName = std::wstring(szFName).append(szExt);
            auto configPath = std::wstring(szDrive).append(szDir).append(ConfigFileName);
            wcstombs_s(&cntConverted, szAnsiPath, configPath.data(), configPath.size());
            std::ifstream configFile(szAnsiPath);

            if (wcscmp(targetImage, imageName.data()) == 0 && configFile.good()) {
                configFile.close();
                retval.assign(configPath);
                found = TRUE;
            }
        }

        return ret;
    });

    return retval;
}

 

HANDLE ConfigReader::GetThreadToken() {
    HANDLE retval = NULL;
    auto flags = TOKEN_ADJUST_PRIVILEGES | TOKEN_QUERY;

    if (!OpenThreadToken(GetCurrentThread(), flags, FALSE, &retval)) {
        if (GetLastError() == ERROR_NO_TOKEN) {
            if (ImpersonateSelf(SecurityImpersonation) &&
                !OpenThreadToken(GetCurrentThread(), flags, FALSE, &retval))
                retval = NULL;
        }
    }
    return retval;
}

 

BOOL ConfigReader::SetPrivilege(HANDLE& hToken, LPCTSTR Privilege, BOOL bEnablePrivilege) {
    LUID luid;
    auto retval = FALSE;
    TOKEN_PRIVILEGES tp = {0};
    DWORD cb = sizeof(TOKEN_PRIVILEGES);

    if (LookupPrivilegeValue(NULL, Privilege, &luid)) {
        tp.PrivilegeCount = 1;
        tp.Privileges[0].Luid = luid;
        tp.Privileges[0].Attributes = bEnablePrivilege ? SE_PRIVILEGE_ENABLED : 0;
        AdjustTokenPrivileges(hToken, FALSE, &tp, cb, NULL, NULL);

        if (GetLastError() == ERROR_SUCCESS)
            retval = TRUE;
    }
    return retval;
}

A few things worth mentioning:

  • Always use STL containers and algorithms when possible (e.g. std::vector instead of a C-style array).
  • To query information about a process, it has to be opened, and most of the time (depending on what you need to do) a few privilege flags must be set. Also remember to close any handle after it’s been used.
  • Use std::wstring (Unicode) instead of wchar_t*. It’s safer and easier to work with.
  • Use lambdas in conjunction with algorithms (e.g. std::find_if).
  • Get to know the functions that convert between multibyte (ANSI) and wide-character (Unicode) strings (e.g. wcstombs_s).
  • Pass references (and use const whenever possible).

Once the configuration file is found, it is parsed with the MSXML parser, which is COM based, therefore the use of smart pointers is strongly suggested.

BOOL ConfigReader::ParseConfigFile(const std::wstring& configFile) {
    auto retval = FALSE;
    VARIANT_BOOL success;
    IXMLDOMDocumentPtr pDocPtr;
    IXMLDOMNodePtr selectedNode;

    CoInitialize(NULL);

    pDocPtr.CreateInstance("Msxml2.DOMDocument.6.0");

    if (SUCCEEDED(pDocPtr->load(_variant_t(configFile.c_str()), &success))) {
        if (SUCCEEDED(pDocPtr->selectSingleNode(_bstr_t(XmlRootNode), &selectedNode))) {
            ProcessElementRecursively(selectedNode);
            retval = TRUE;
        }
    }

    CoUninitialize();

    return retval;
}

 

void ConfigReader::ProcessElementRecursively(IXMLDOMNodePtr& node) {
    long childrenCount = 0;
    IXMLDOMNodePtr childNode;
    IXMLDOMNodeListPtr children;

    CoInitialize(NULL);

    if (SUCCEEDED(node->get_childNodes(&children)) && SUCCEEDED(children->get_length(&childrenCount)) && childrenCount > 0) {
        for (auto nCount = 0; nCount < childrenCount; nCount++) {
            if (SUCCEEDED(children->get_item(nCount, &childNode))) {
                ExtractInformationFromElement(childNode);
                ProcessElementRecursively(childNode);
            }
        }
    }

    CoUninitialize();
}

 

void ConfigReader::ExtractInformationFromElement(IXMLDOMNodePtr& node) {
    size_t nSize;
    VARIANT value;
    std::wstring key;
    BSTR nodeContent;
    DOMNodeType nodeType;
    WCHAR szNodeText[512] = {0};
    char szBuffer[MAX_PATH] = {0};

    CoInitialize(NULL);

    if (SUCCEEDED(node->get_nodeType(&nodeType)) && nodeType == DOMNodeType::NODE_ELEMENT) {
        nodeContent = SysAllocString(szNodeText);
        auto pElement = (IXMLDOMElementPtr)node;
        pElement->get_tagName(&nodeContent);

        if (wcscmp(nodeContent, L"sendOutputToVSWindow") == 0) {
            pElement->getAttribute(_bstr_t(L"enabled"), &value);

            if (value.vt != VT_NULL)
                Properties.insert(std::make_pair(nodeContent, value.bstrVal));

        } else if (wcscmp(nodeContent, L"extension") == 0) {
            pElement->getAttribute(_bstr_t(L"name"), &value);

            if (value.vt != VT_NULL)
                key.assign(value.bstrVal);

            pElement->getAttribute(_bstr_t(L"path"), &value);

            if (value.vt != VT_NULL && !key.empty()) {
                Properties.insert(std::make_pair(key.c_str(), value.bstrVal));
                wcstombs_s(&nSize, szBuffer, key.c_str(), key.size());
                std::string name(szBuffer);
                wcstombs_s(&nSize, szBuffer, value.bstrVal, wcslen(value.bstrVal));
                std::string path(szBuffer);
                m_extensions.push_back(ExtInformation(name, path));
            }
        }

        SysFreeString(nodeContent);
    }

    CoUninitialize();
}

Our ConfigReader object has two main fields: Extensions and Properties.


We can also retrieve any property from the configuration file, in a similar way to how we do it in .NET. This is accomplished via the GetSetting method.

const std::wstring ConfigReader::GetSetting(const wchar_t* key) {
    std::wstring retval;

    if (Properties.size() > 0 && key != nullptr && wcslen(key) > 0) {
        typedef std::pair<const std::wstring, const std::wstring> item;

        std::find_if(Properties.begin(), Properties.end(), [&](item i) {
            auto ret = FALSE;

            if (retval.size() == 0) {
                if (wcscmp(i.first.data(), key) == 0) {
                    retval.assign(i.second);
                    ret = TRUE;
                }
            }
            return ret;
        });
    }

    return retval;
}


Regards,

Angel

Visual Studio “14” CTP – General Availability

Hi Community,

Microsoft announced the general availability of Visual Studio “14” CTP today. There are quite a few interesting features in both native and managed languages.

Please find below some resources about this release:

Regards,

Angel