Sunday, August 26, 2012

Developing C++ shared libraries with Eclipse CDT

Consider the following not-so-uncommon scenario: you have a shared library project in Eclipse and an executable project which uses the shared library.

I found that you have to do some additional things to build, run and debug the executable project in Eclipse (note that I'm doing this on Linux using CDT 4.0.3):

Define your shared library's project as a reference for your application. To do so, open the project properties of the executable project, go to Project References and select the shared library project (in my case "shared" is also the name of the shared library).

In the project settings of your executable project, add the shared library to the linker settings, as below:

  1. Select the project name, right-click, and choose Properties,
  2. Select C/C++ Build > C/C++ Linker,
  3. Add the library name of your shared library,
  4. Specify the shared library path "${workspace_loc:/SharedLib/Debug}"
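
For reference, here is a minimal sketch of what the two projects might contain. The project name SharedLib, the greet() function and the file names are placeholders of my own, not something Eclipse prescribes.

// SharedLib project: greeter.h
#ifndef GREETER_H
#define GREETER_H
#include <string>
std::string greet(const std::string& name);
#endif

// SharedLib project: greeter.cpp (compiled into the shared library)
#include "greeter.h"
std::string greet(const std::string& name) {
    return "Hello, " + name;
}

// Executable project: main.cpp (links against the shared library)
#include <iostream>
#include "greeter.h"
int main() {
    std::cout << greet("Eclipse") << std::endl;
    return 0;
}

With the project reference and the linker settings above in place, Eclipse rebuilds the library before the executable and passes the equivalent of -lshared and -L"${workspace_loc:/SharedLib/Debug}" to the linker.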

Reference:

http://dirkraffel.com/2008/06/27/developing-shared-libraries-with-eclipse-cdt/

Saturday, August 25, 2012

CppUnit with Eclipse CDT Tutorial

About Eclipse CDT

Eclipse C/C++ Development Toolkit (CDT) is an extension to the Eclipse platform in the form of a plug-in, available for download on all platforms. The plug-in's open-source nature and user-friendliness have made it popular not just among Linux developers but also among C++ developers on other platforms. CDT and the Web Tools plug-ins are the two most popular Eclipse plug-ins, and nearly two out of three developers using CDT are Windows users.

CDT has subcomponents, or plug-ins, that are independent projects in the CDT community. The most important is the CDT primary plug-in, which provides the core CDT capabilities.

  • CDT UI: UI-related features, views, editors, wizards, etc.
  • CDT Debug: core debugging capabilities.
  • CDT Debug UI: UI capabilities for the debug editors and views.
  • CDT Feature: the CDT feature component.
  • CDT Core: the Core Model, CDOM, and other core components.
  • CDT Launch: the launch mechanism for launching external executables and tools.
  • CDT Debug MI: the application connector for MI-compatible debuggers.

CDT editors have several features that make them popular. For example, syntax highlighting and code assist make software development quick and easy. Syntax highlighting is configurable and can be personalized to your individual taste. Code assist is the code completion feature that is similar to the one in Visual Studio. Custom-defined code templates can be added to the plug-in, which can be used by code assist.
In the following sections, we will learn how to use CDT effectively for testing a project with CppUnit.

Installing Eclipse CDT

Download Eclipse from http://eclipse.org/downloads/, e.g. eclipse-cpp-XXX-linux-gtk-x86_64.tar.gz.
Eclipse also requires a Java installation.

1. Download the CppUnit distribution file cppunit-1.10.2.tar.gz and unpack it in a directory of your choice. From now on we will refer to this directory as $CppUnit.
2. Run MSYS to open a command-line shell, then cd to the directory where you extracted CppUnit and run:
./configure

There's no need to go any further than this with make, make install, etc. All we will need is the header $CppUnit/include/cppunit/config-auto.h, which was generated in the step above.
Go to Eclipse and create a Managed Make C++ Project to generate a library (either static or shared).
Import all .h and .cpp files from $CppUnit/src/cppunit into the project.
Open the project properties dialog (Alt + Enter) and add $CppUnit/include to the include path.
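
Once the CppUnit library builds, create a separate executable project that links against it (and add the same include path) to host the actual tests. As a rough sketch, a minimal test case and text runner could look like the following; the fixture name and the trivial assertion are only illustrative.

#include <cppunit/TestFixture.h>
#include <cppunit/extensions/HelperMacros.h>
#include <cppunit/ui/text/TestRunner.h>

// Illustrative fixture; replace the test body with checks against your own code
class SimpleTest : public CppUnit::TestFixture {
    CPPUNIT_TEST_SUITE(SimpleTest);
    CPPUNIT_TEST(testAddition);
    CPPUNIT_TEST_SUITE_END();
public:
    void testAddition() {
        CPPUNIT_ASSERT_EQUAL(4, 2 + 2);
    }
};

int main() {
    CppUnit::TextUi::TestRunner runner;
    runner.addTest(SimpleTest::suite()); // suite() is generated by the macros above
    return runner.run() ? 0 : 1;         // non-zero exit code when a test fails
}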


Thursday, August 23, 2012

Creating a local yum repository in CentOS


Sometimes it can be handy to set up your own repository to avoid downloading from the remote repository over and over again. This tutorial shows how to create a CentOS mirror for your local network. If you have to install multiple systems on your local network, all needed packages can be downloaded over the fast LAN connection, saving your internet bandwidth.

Create the Directories

mkdir -pv /var/www/html/centos/VER/{os,updates}/i386

Replace VER and i386 with your major version and architecture.
Additionally you'll need some deeper directories; this is the correct location to copy the CD/DVD RPMs to:

mkdir -pv /var/www/html/centos/VER/os/i386/CentOS/RPMS/

The Base Repository
Copy the RPMs from the CDs/DVD to /var/www/html/centos/VER/os/i386/CentOS/RPMS/ (the directory created above).

Create the base repository headers:

createrepo /var/www/html/centos/VER/os/i386/

The Updates Repository

Select an rsync mirror for updates: check the list of available mirrors (CentOS mirror list); mirrors that support rsync are identified with rsync:// URLs.

For example: rsync://ftp.belnet.be/packages/centos/

The mirrors share a common structure for updates. Simply append /updates/<version>/<base arch>.

Rsync to create the updates-released repository:

/usr/bin/rsync -avrt rsync://ftp.riken.jp/centos/4/updates/i386 --exclude=debug/ /var/www/html/centos/4/updates/

This will create a complete update repository at /var/www/html/centos/4/updates/i386. The repodata directory will be created with all of the headers.

You can additionally pipe this into mail to receive an email when updates are available.

/usr/bin/rsync -avrt rsync://ftp.riken.jp/centos/4/updates/i386 --exclude=debug /var/www/html/centos/4/updates/ | /bin/mail you@example.com -s "New Repo Updates"

Next, I would advise setting up a cron job to run the rsync command above. This way your repository is kept up to date, and only new updates and headers will be downloaded.
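
For example, a nightly crontab entry (the schedule and the log file path are arbitrary choices of mine) could look like this:

0 3 * * * /usr/bin/rsync -avrt rsync://ftp.riken.jp/centos/4/updates/i386 --exclude=debug/ /var/www/html/centos/4/updates/ >> /var/log/repo-sync.log 2>&1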

Client Site Configuration

Yum Configuration

Edit the CentOS-Base repo file on each client:
vi /etc/yum.repos.d/CentOS-Base.repo


[base]
name=CentOS-$releasever - Base
baseurl=http://192.168.*.*/centos/$releasever/os/$basearch/
#mirrorlist=http://mirrorlist.centos.org/?release=$releasever&arch=$basearch&repo=os
#baseurl=http://mirror.centos.org/centos/$releasever/os/$basearch/
gpgcheck=1
gpgkey=http://mirror.centos.org/centos/RPM-GPG-KEY-CentOS-5
#released updates
[update]
name=CentOS-$releasever - Updates
baseurl=http://192.168.*.*/centos/$releasever/updates/$basearch/
#mirrorlist=http://mirrorlist.centos.org/?release=$releasever&arch=$basearch&repo=updates
#baseurl=http://mirror.centos.org/centos/$releasever/updates/$basearch/
gpgcheck=1
gpgkey=http://mirror.centos.org/centos/RPM-GPG-KEY-CentOS-5

Sunday, August 12, 2012

C++ Thread Using Boost library sample

This post shows the essentials of C++ threading using the Boost.Thread library. The listing below defines the "singing" work to be done; a threaded variant is sketched after it.

#include <iostream>
#include <sstream>
#include <vector>
#include <string>
#include <boost/thread.hpp>
#include <boost/date_time.hpp>
#include <boost/bind.hpp>

const std::string RowRowRow("Row row row your boat gently down the stream");
const std::string Teapot("I'm a little teapot short and stout");

void sing(const std::string& lyrics, boost::posix_time::time_duration interval, bool indent = false) {
    std::istringstream iss;
    iss.str(lyrics);
    std::string current;
    do {
        iss >> current;
        if (iss) {
            // extra spaces make it easier to read when interleaved by threading
            if (indent)
                std::cout << "\t\t";
            std::cout << current << "\n";
            boost::this_thread::sleep( interval );
        } // end if
    } while ( !iss.bad() && !iss.eof() );
    std::cout << "\n";
} // end sing()


class Singer {
    std::string m_lyrics;
    boost::posix_time::time_duration m_interval;
    bool m_indent;
public:
    Singer(const std::string& lyrics, boost::posix_time::time_duration interval, bool indent)
        : m_lyrics(lyrics),
          m_interval(interval),
          m_indent(indent)
    {
    } // end constructor

    void perform() {
        sing(m_lyrics, m_interval, m_indent);
    } // end perform()
}; // end class Singer


int main(int argc, char* argv[]) {
    using namespace boost::posix_time;

    time_duration interval( milliseconds(250) );
    auto delay( milliseconds(60) );

    // "sing" with a function
    sing( RowRowRow, interval );

    // delay
    boost::this_thread::sleep( delay );

    // "sing" with a member function
    Singer teapotSinger(Teapot, interval, true);
    teapotSinger.perform();

    return 0;
} // end main()
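
The listing above performs both songs one after the other. Below is a minimal sketch of running them concurrently with Boost threads, reusing the sing() function and the Singer class from the listing; the thread variable names are mine, boost::thread copies its arguments, and boost::bind comes from the <boost/bind.hpp> header already included above.

// Threaded variant: replace the direct calls inside main() with two threads
boost::thread rowThread(sing, RowRowRow, interval, false);

Singer teapotSinger(Teapot, interval, true);
boost::thread teapotThread(boost::bind(&Singer::perform, &teapotSinger));

// Wait for both songs to finish before main() returns
rowThread.join();
teapotThread.join();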

Reference

http://www.advancedcplusplus.com/5min-threads/

Saturday, August 4, 2012

Thread samples

C++11 Standard thread sample using std::thread


#include <iostream>
#include <thread>

void thFun(int i) {
  std::cout << "Worker " << i << "!\n";
}

int main() {
  // Create a thread
  std::thread th(thFun, 1); // pass the function and its argument to the thread constructor
  std::cout << "Main Thread!\n";
 
  // Wait until the worker thread finishes its job by calling join()
  th.join();
 
  return 0;
}

Multiple threads


// Sample of launching multiple threads
#include <iostream>
#include <thread>
#include <vector>    // std::vector
#include <algorithm> // std::for_each
#include <cassert>

void thFun(int i) {
  std::cout << "Worker " << i << "!\n";
}

int main() {
  // Create a container to store the created threads
  std::vector<std::thread> workers;

  for (int i = 0; i < 10; ++i) {
    auto th = std::thread(&thFun, i);
    // Move each thread into the container (the local handle becomes empty)
    workers.push_back(std::move(th));
    assert(!th.joinable());
  }

  std::cout << "Main Thread!\n";
  std::for_each(workers.begin(), workers.end(), [](std::thread& th) {
    assert(th.joinable());
    th.join();
  });
  return 0;
}

Makefiles

Make searches the current directory for the makefile to use, e.g. GNU make searches files in order for a file named one of GNUmakefile, makefile, Makefile and then runs the specified (or default) target(s) from (only) that file.

The makefile language is similar to declarative programming. This class of language, in which necessary end conditions are described but the order in which actions are to be taken is not important, is sometimes confusing to programmers used to imperative programming.
One problem in build automation is the tailoring of a build process to a given platform. For instance, the compiler used on one platform might not accept the same options as the one used on another. This is not well handled by Make. This problem is typically handled by generating platform specific build instructions, which in turn are processed by Make. Common tools for this process are Autoconf and CMake.

Rules
A makefile consists of rules. Each rule begins with a textual dependency line which defines a target followed by a colon (:) and optionally an enumeration of components (files or other targets) on which the target depends. The dependency line is arranged so that the target (left hand of the colon) depends on components (right hand of the colon). It is common to refer to components as prerequisites of the target.

For example, a C .o object file is created from .c files, so you need to have .c files first (i.e. specific object file target depends on a C source file and header files). Because Make itself does not understand, recognize or distinguish different kinds of files, this opens up a possibility for human error. A forgotten or an extra dependency may not be immediately obvious and may result in subtle bugs in the generated software. It is possible to write makefiles which generate these dependencies by calling third-party tools, and some makefile generators, such as the Automake toolchain provided by the GNU Project, can do so automatically.

After each dependency line, a series of command lines may follow which define how to transform the components (usually source files) into the target (usually the "output"). If any of the components have been modified, the command lines are run.

Make can decide where to start through topological sorting.
Each command line must begin with a tab character to be recognized as a command. The tab is a whitespace character, but the space character does not have the same special meaning. This is problematic, since there may be no visual difference between a tab and a series of space characters. This aspect of the syntax of makefiles is often subject to criticism.

Each command is executed by a separate shell or command-line interpreter instance. Since operating systems use different command-line interpreters this can lead to unportable makefiles. For instance, GNU Make by default executes commands with /bin/sh, where Unix commands like cp are normally used. In contrast to that, Microsoft's nmake executes commands with cmd.exe where batch commands like copy are available but not necessarily cp.

    target [target ...]: [component ...]
    [<TAB>command 1]
           .
           .
           .
    [<TAB>command n]

Usually each rule has a single unique target, rather than multiple targets.
A rule may have no command lines defined. The dependency line can consist solely of components that refer to targets, for example:

    realclean: clean distclean

The command lines of a rule are usually arranged so that they generate the target. An example: if "file.html" is newer, it is converted to text. The contents of the makefile:

    file.txt: file.html
            lynx -dump file.html > file.txt

The above rule would be triggered when Make updates "file.txt". In the following invocation, Make would typically use this rule to update the "file.txt" target if "file.html" were newer.

    make file.txt

Command lines can have one or more of the following three prefixes:
a hyphen-minus (-), specifying that errors are ignored
an at sign (@), specifying that the command is not printed to standard output before it is executed
a plus sign (+), the command is executed even if Make is invoked in a "do not execute" mode
Ignoring errors and silencing echo can alternatively be obtained via the special targets ".IGNORE" and ".SILENT".
Microsoft's NMAKE has predefined rules that can be omitted from these makefiles, e.g. "c.obj $(CC)$(CFLAGS)".
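
A contrived rule showing the three prefixes together (the target name and the subdirectory are hypothetical):

    # '@' silences the echo, '-' ignores errors from rm,
    # '+' runs the recursive make even under "make -n"
    tidy:
            @echo "Removing build artifacts"
            -rm -f *.o core
            +$(MAKE) -C subdir tidy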

Macros
A makefile can contain definitions of macros. Macros are usually referred to as variables when they hold simple string definitions, like "CC=gcc". Macros in makefiles may be overridden in the command-line arguments passed to the Make utility. Environment variables are also available as macros.
Macros allow users to specify the programs invoked and other custom behavior during the build process. For example, the macro "CC" is frequently used in makefiles to refer to the location of a C compiler, and the user may wish to specify a particular compiler to use.
New macros (or simple "variables") are traditionally defined using capital letters:

    MACRO = definition

A macro is used by expanding it. Traditionally this is done by enclosing its name inside $(). A rarely used but equivalent form uses curly braces rather than parentheses, i.e. ${}.
    NEW_MACRO = $(MACRO)-$(MACRO2)
Macros can be composed of shell commands by using the command substitution operator, denoted by backticks (`).
    YYYYMMDD  = ` date `
The content of the definition is stored "as is". Lazy evaluation is used, meaning that macros are normally expanded only when their expansions are actually required, such as when used in the command lines of a rule. An extended example:
    PACKAGE   = package
    VERSION   = ` date +"%Y.%m%d" `
    ARCHIVE   = $(PACKAGE)-$(VERSION)

    dist:
            #  Notice that only now macros are expanded for shell to interpret:
            #      tar -cf package-`date +"%Y%m%d"`.tar

            tar -zcf $(ARCHIVE).tar .
The generic syntax for overriding macros on the command line is:
    make MACRO="value" [MACRO="value" ...] TARGET [TARGET ...]
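For instance (the compiler and flags are arbitrary):
    make CC=clang CFLAGS="-O2 -Wall" all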
Makefiles can access any of a number of predefined internal macros, with '?' and '@' being the most common.
    target: component1 component2
            echo $? contains those components, which need attention (i.e. they ARE YOUNGER than current TARGET).
            echo $@ evaluates to current TARGET name from among those left of the colon.

Suffix rules
Suffix rules have "targets" with names in the form .FROM.TO and are used to launch actions based on file extension. In the command lines of suffix rules, POSIX specifies that the internal macro $< refers to the prerequisite and $@ refers to the target. In this example, which converts any HTML file into text, the shell redirection token > is part of the command line whereas $< is a macro referring to the HTML file:

    .SUFFIXES: .txt .html

    # From .html to .txt
    .html.txt:
            lynx -dump $<   >   $@

When called from the command line, the above suffix rule expands as follows.

    $ make -n file.txt
    lynx -dump file.html > file.txt

Other elements
Single-line comments are started with the hash symbol (#).
Some directives in makefiles can include other makefiles.
Line continuation is indicated with a backslash \ character at the end of a line.

    target: component \
            component
    <TAB>command ;          \
    <TAB>command |          \
    <TAB>piped-command

Example makefiles
Makefiles are traditionally used for compiling code (*.c, *.cc, *.C, etc.), but they can also be used for providing commands to automate common tasks. One such makefile is called from the command line:

    make                        # Without argument runs first TARGET
    make help                   # Show available TARGETS
    make dist                   # Make a release archive from current dir

The makefile:

    PACKAGE      = package
    VERSION      = ` date "+%Y.%m%d" `
    RELEASE_DIR  = ..
    RELEASE_FILE = $(PACKAGE)-$(VERSION)

    # Notice that the variable LOGNAME comes from the environment in
    # POSIX shells.
    #
    # target: all - Default target. Does nothing.
    all:
            echo "Hello $(LOGNAME), nothing to do by default"
            # very rarely: echo "Hello ${LOGNAME}, nothing to do by default"
            echo "Try 'make help'"

    # target: help - Display callable targets.
    help:
            egrep "^# target:" [Mm]akefile

    # target: list - List source files
    list:
            # Won't work. Each command is in separate shell
            cd src
            ls

            # Correct, continuation of the same shell
            cd src; \
            ls

    # target: dist - Make a release.
    dist:
            tar -cf  $(RELEASE_DIR)/$(RELEASE_FILE).tar . && \
            gzip -9  $(RELEASE_DIR)/$(RELEASE_FILE).tar


Below is a very simple makefile that by default (the "all" rule is listed first) compiles a source file called "helloworld.c" using the gcc C compiler and also provides a "clean" target to remove the generated files if the user desires to start over. The $@ and $< are two of the so-called internal macros (also known as automatic variables) and stand for the target name and "implicit" source, respectively. In the example below, $^ expands to a space delimited list of the prerequisites. There are a number of other internal macros.

    CC     = gcc
    CFLAGS = -g

    all: helloworld

    helloworld: helloworld.o
            # Commands start with TAB not spaces
            $(CC) $(LDFLAGS) -o $@ $^

    helloworld.o: helloworld.c
            $(CC) $(CFLAGS) -c -o $@ $<

    clean: FRC
            rm -f helloworld helloworld.o

    # This pseudo target causes all targets that depend on FRC
    # to be remade even in case a file with the name of the target exists.
    # This works with any make implementation under the assumption that
    # there is no file FRC in the current directory.
    FRC:


Many systems come with predefined Make rules and macros to specify common tasks such as compilation based on file suffix. This allows the user to omit the actual (often unportable) instructions for how to generate the target from the source(s). On such a system the above makefile could be modified as follows:

    all: helloworld

    helloworld: helloworld.o
        $(CC) $(CFLAGS) $(LDFLAGS) -o $@ $^

    clean: FRC
        rm -f helloworld helloworld.o

    # This is an explicit suffix rule. It may be omitted on systems
    # that handle simple rules like this automatically.
    .c.o:
        $(CC) $(CFLAGS) -c $<

    FRC:
    .SUFFIXES: .c

That "helloworld.o" depends on "helloworld.c" is now automatically handled by Make. In such a simple example as the one illustrated here this hardly matters, but the real power of suffix rules becomes evident when the number of source files in a software project starts to grow. One only has to write a rule for the linking step and declare the object files as prerequisites. Make will then implicitly determine how to make all the object files and look for changes in all the source files.
Simple suffix rules work well as long as the source files do not depend on each other and on other files such as header files. Another route to simplify the build process is to use so-called pattern matching rules that can be combined with compiler-assisted dependency generation. As a final example requiring the gcc compiler and GNU Make, here is a generic makefile that compiles all C files in a folder to the corresponding object files and then links them to the final executable. Before compilation takes place, dependencies are gathered in a makefile-friendly format into a hidden file ".depend" that is then included in the makefile.

    # Generic GNUMakefile

    # Just a snippet to stop executing under other make(1) commands
    # that won't understand these lines
    ifneq (,)
    This makefile requires GNU Make.
    endif

    PROGRAM = foo
    C_FILES := $(wildcard *.c)
    OBJS := $(patsubst %.c, %.o, $(C_FILES))
    CC = cc
    CFLAGS = -Wall -pedantic
    LDFLAGS =

    all: $(PROGRAM)

    $(PROGRAM): .depend $(OBJS)
        $(CC) $(CFLAGS) $(OBJS) $(LDFLAGS) -o $(PROGRAM)

    depend: .depend

    .depend: cmd = gcc -MM -MF depend $(var); cat depend >> .depend;
    .depend:
        @echo "Generating dependencies..."
        @$(foreach var, $(C_FILES), $(cmd))
        @rm -f depend

    -include .depend

    # These are the pattern matching rules. In addition to the automatic
    # variables used here, the variable $* that matches whatever % stands for
    # can be useful in special cases.
    %.o: %.c
        $(CC) $(CFLAGS) -c $< -o $@

    %: %.c
        $(CC) $(CFLAGS) -o $@ $<

    clean:
        rm -f .depend *.o

    .PHONY: clean depend

Make

In software development, Make is a utility that automatically builds executable programs and libraries from source code by reading files called makefiles which specify how to derive the target program. Though integrated development environments and language-specific compiler features can also be used to manage a build process, Make remains widely used, especially in Unix.

Behavior

Make is typically used to build executable programs and libraries from source code. Generally though, any process that involves transforming a source file to a target result (by executing arbitrary commands) is applicable to Make. For example, Make could be used to detect a change made to an image file (the source) and the transformation actions might be to convert the file to some specific format, copy the result into a content management system, and then send e-mail to a predefined set of users that the above actions were performed.
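A sketch of such a non-compilation rule (the conversion tool, paths, and mail address are hypothetical):

    photo.png: photo.svg
            convert photo.svg photo.png
            cp photo.png /var/www/cms/assets/
            mail -s "photo.png was regenerated" team@example.com < /dev/null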
Make is invoked with a list of target file names to build as command-line arguments:
    make TARGET [TARGET ...]


References

http://www.openbsd.org/cgi-bin/man.cgi?query=make#FILES
http://en.wikipedia.org/wiki/O%27Reilly_Media

C++11 eclipse MinGW configuration

This post is a memo on how to configure Eclipse C/C++ on Windows using MinGW for the C++11 standard.


  • Make a new C++ project
  • Default options for everything
  • Once created, right-click the project and go to "Properties"
  • C/C++ Build -> Settings -> Tool Settings -> GCC C++ Compiler -> Miscellaneous -> Other Flags. Put -std=c++0x at the end. (Depending on the toolchain, this entry may appear as Cygwin C++ Compiler instead of GCC C++ Compiler.)
  • C/C++ General -> Paths and Symbols -> Symbols -> GNU C++. Click "Add..." and paste __GXX_EXPERIMENTAL_CXX0X__ (note the two leading and two trailing underscores) into "Name" and leave "Value" blank.
  • Hit Apply, do whatever it asks you to do, then hit OK.
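
To verify the setup, a small file that uses C++11 features (for example brace initialization, auto, and a range-based for loop) should now compile inside the project:

#include <iostream>
#include <vector>

int main() {
  std::vector<int> values = {1, 2, 3}; // brace initialization (C++11)
  for (auto v : values)                // auto and range-based for (C++11)
    std::cout << v << "\n";
  return 0;
}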


Thursday, August 2, 2012

Converting a legacy C++ project that uses Makefiles to the Eclipse IDE

Suppose you have a legacy C++ project on Linux which uses the typical:
./configure
make
make install

to build and install.

If you would really like to build it instead with an IDE like Eclipse, how to achieve this?

Using Eclipse with the CDT plugin will allow you to use it for C/C++ projects, and you can tell it to use Makefiles to build your project. You just have to set up a Makefile project. You might have to tell it to let you manage the Makefiles rather than have it do it - I don't remember off the top of my head - but there should be no problem in setting up Eclipse to use pre-existing Makefiles to build a pre-existing project. I've done it before.

You will have to tell it where the include directories are and what macros to assume are defined for things like code completion to work correctly (I don't know of any way for Eclipse to figure that out for you), so there is definitely some set up that you'll have to do. But it definitely works.

Just grab the C++ version of Eclipse from their site (it comes with all of the appropriate C/C++ plugins so that you don't have to track them down), and you can look at the CDT site for documentation, frequently asked questions, etc.

References

http://www.ibm.com/developerworks/aix/library/au-unix-eclipse/index.html
http://help.eclipse.org/galileo/index.jsp?topic=/org.eclipse.cdt.doc.user/getting_started/cdt_w_existing_code.htm

Wednesday, August 1, 2012

C++11

C++11 (formerly known as C++0x) is the most recent iteration of the C++ programming language. It was approved by ISO on 12 August 2011, replacing C++03. The name is derived from the tradition of naming language versions by the year of the specification's publication.
C++11 includes several additions to the core language and extends the C++ standard library, incorporating most of the C++ Technical Report 1 (TR1) libraries, with the exception of the library of mathematical special functions. C++11 was published as ISO/IEC 14882:2011 in September 2011 and is available for a fee. The working draft most similar to the published C++11 standard is N3337, dated 12 January 2012; it has only editorial corrections from the C++11 standard.

Reference

  1. An excellent online video tutorial on C++11 concurrency:
    http://www.youtube.com/watch?v=80ifzK3b8QQ