Android Smart Girls — Finishing Line!

Yesterday, the Android Smart Girls project crossed the finishing line, with an amazing prize award ceremony. As you might remember, this was the pilot for an extra-curricular computer programming activity for high-school girls. In its first phase, the girls had classes on MIT App Inventor; in the second phase, they proposed and implemented their own apps with the help of mentors.

The project was an initiative of Prof. Juliana Borin (Institute of Computing / UNICAMP), the girls from IEEE Women in Engineering South Brazil, and IEEE WIE founder (now at SAMSUNG Research Brazil) Dr. Vanessa Testoni, in cooperation with Hilton Federici State High School, in Campinas, and many, many, many wonderful, generous volunteers. The project was supported by SAMSUNG and by a grant from CNPq.

The project leaders, project contributors, and I are working to document the initiative as open courseware that makes it possible to reproduce it in other schools throughout Brazil. The project leaders and I also want to ensure that all the many contributors to the project — from Hilton Federici High, from IEEE WIE, from UNICAMP, from SAMSUNG — get their work acknowledged.

Stay tuned!


Dr. Sandra Avila, mentor of the winning team, is my former Ph.D. student and current postdoc. Ms. Nadja Ramos, the other mentor, is doing her undergraduate capstone project under my supervision (her capstone project, incidentally, is directly related to the Smart Girls initiative). Needless to say, I was as proud as a peacock.

Back in the developer’s saddle in Yosemite; Installing Maven on OS X

When Mavericks launched, I scheduled a clean reinstall over a blank, reformatted HD. (Due to the degradation of configurations, permissions, and other metadata, a system may suffer something akin to a long-term aging effect. A reinstallation from scratch is a way to freshen it up.) The task, however, was marked “low priority” in my To Do list. The result: last week I was forced to upgrade to Yosemite, and still no reformatting.

As I explained in that post, I’ve noticed a trend of CS/IT professionals being the users most reluctant to update to the latest hardware or software. Yosemite justified that reluctance by breaking my HomeBrew installation. The reason: HomeBrew explicitly links to Ruby 1.8, which Yosemite obliterates in favor of Ruby 2.0. (Hey, Apple, a word of advice: it’s no use having a sophisticated system of coexisting Framework versions if you decide on a whim to delete the older ones.)

I had experienced some minor inconveniences before I encountered this problem. In the text that follows, I assume that you have already dealt with the following:

  1. Updating Xcode on the App Store (Menu Apple … App Store…; tab Updates);
  2. Re-accepting the terms and conditions of Xcode: neither Xcode nor its command-line tools will run before you sell your soul to Apple again. And even if you have administrator permissions, you have to sudo a command-line tool to be able to do it. You’ll see an ugly message like: “Agreeing to the Xcode/iOS license requires admin privileges, please re-run as root via sudo.” Either re-execute the command with sudo (e.g., sudo make), or accept the agreement in the Xcode graphical app;
  3. (Possibly?) Reinstalling the Java VM from Oracle. This might just be an issue for web browsing; maybe the VM works on the command line out of the box: I didn’t check. But if you type java on the Terminal and nothing happens, chances are you’ll need to get it before being able to do anything interesting.
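From the Terminal, items 2 and 3 boil down to a couple of commands — a sketch assuming the stock OS X and Xcode locations on your system:

```shell
# Item 2: re-accept the Xcode license (needs admin rights, hence sudo)
sudo xcodebuild -license

# Item 3: check whether a Java VM answers on the command line
java -version
/usr/libexec/java_home   # prints the path of the current JVM, if any
```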

The bad news: the only way I could get HomeBrew back to work was reinstalling Ruby 1.8.

The good news (if you have a Time Machine): doing it is a breeze. Just restore the folder


to its rightful place.

If you don’t have a Time Machine (how do you even survive on OS X without one?!), maybe you have an old MacBook stored in a cupboard? Or an upgrade-averse friend who has not yet moved to Yosemite? (Hint: do you know anyone who works in CS/IT?) Get a copy of that folder and put it back where it belongs.

If you can’t get your hands on that folder anywhere, you’re probably out of luck. You might be able to fish the framework out of the installer packages of an older OS X version, but just thinking about it makes me want to cry. Maybe you can wait for HomeBrew to issue a patch?

With Ruby 1.8 back in place, things become very straightforward. Just to be sure, run the following commands:

brew update

brew doctor

And check if there are any remaining issues to be solved. (By the by, you don’t have to try and solve every minor problem: in computing as in medicine, minimal intervention is often wise.)

This whole marathon started when I needed to install Maven on my system.

With HomeBrew working, this takes a one-liner:

brew install maven

The installation worked without issues, but for some reason, Maven kept complaining that the JAVA_HOME environment variable was broken:

Error: JAVA_HOME is not defined correctly.
  We cannot execute /usr/libexec/java_home/bin/java

Naïvely setting JAVA_HOME to /usr let Maven run, but with an irritating warning:

Unable to find a $JAVA_HOME at '/usr', continuing with system-provided Java...

What solved the problem completely was adding this line to ~/.bash_profile:

export JAVA_HOME=`/usr/libexec/java_home`

“Upgrade to Yosemite,” they said.
“It will be fun,” they said.

…but if you experience the same problem, you’ll first need to check where the java_home utility (it prints the path of the Java VM on stdout) actually is on your system. If /usr/libexec/java_home runs, the solution above will probably work.

Upgrade cascade: iPhone, Yosemite, iPhoto, iMovie

I’ve noticed a consistent trend among my colleagues and me, Computer Science / Engineering faculty, of being way less eager than the general public to update to the latest hardware or software. There is, maybe, a component of the shoemaker’s son going barefoot, but most importantly — I suspect — it’s the knowledge of sausage-making impairing our appetites. When you know the reality of system design intimately, you become very reluctant to disturb whatever metastability you might have reached.

But all systems have a service life, and eventually even the most reluctant user is forced to upgrade. After skipping two generations, I thought it was time to abandon my iPhone 4S for a new iPhone 6.

(Which was an adventure in itself: amazingly, after almost two months, there are still queues to buy an iPhone in the States. So far, OK — supply and demand, etc. — but for some unfathomable reason, Apple has instructed its clerks to outright lie about the non-contract T-Mobile iPhone, saying that it is not unlocked. After some googling and whatsapping with friends, the truth emerged: it is unlocked. Still, at the first Apple Store I tried, the clerks were very uncooperative, and one of them positively adversarial, as if he’d rather not sell anything to me. I am really not the type of person to buy into this “privilege to be a customer” attitude, so I just went to another store. Long story short: two days and 830 bucks later, I had an iPhone 6 in my pocket. It is indeed unlocked; I had it working with my Vivo nano-SIM immediately, still inside the store.)

But, as often happens, one upgrade leads to another in a cascade effect: the iPhone rejected my old iTunes, forcing me to upgrade old faithful Mountain Lion to Yosemite.

As if to confirm that upgrading is a messy business, Yosemite gave me a great welcoming surprise: it disabled my old iPhoto (“incompatible with new OS version, must be updated”) and made it impossible for me to update it (“Update Unavailable with This Apple ID”). For some strange reason, the App Store utility insisted on that message no matter which Apple ID I used (I only have two).

Apparently this is not a rare situation, and the causes and solutions are exasperatingly diverse. What solved the problem in my case was closing the App Store, deleting iPhoto altogether (dragging the disabled application to the trash), opening the App Store again, and doing a fresh install. The procedure itself is not very painful, I concede: the annoyance is having to find out what exactly to do.

For upgrading iMovie, the solution was not so simple. It is not a mandatory upgrade (the Mountain Lion version still works with Yosemite), but since I had gone so far, I now wanted to go all the way. Deleting iMovie made a fresh install available on the App Store… for 15 bucks. No good. I tried, as suggested by some users, reinstalling the original (from the Snow Leopard CDs, in my case), but to no avail. In the end, I just moved the old Mountain Lion iMovie from the trash back to the Applications folder.

Curiously, Xcode, which is normally a trouble-maker, updated without further ado.

Edit 19/11: upgrading to Yosemite 10.10.1 solved the iMovie Apple ID issue. I’m guessing it would have solved the iPhoto issue as well. This is another golden rule of upgrading — never move to the version with a round number; always wait for the next minor patch.

Paper at SISAP’2014 on large-scale LSH for general metric data

We’ve got a paper — Large-Scale Distributed Locality-Sensitive Hashing for General Metric Data — accepted at the upcoming International Conference on Similarity Search and Applications (SISAP’2014).

I’ll be presenting the paper next week: if you’re planning to be there, please come say hi! My session will be on Wednesday afternoon (October 30th, at 14h).

The paper is part of an ongoing cooperation with my colleague Prof. George Teodoro (University of Brasilia), with my former M.Sc. student Eliezer Silva, and with Petrobras researcher Thiago Teixeira. Here’s the abstract:

Locality-Sensitive Hashing (LSH) is extremely competitive for similarity search, but works under the assumption of uniform access cost to the data, and for just a handful of dissimilarities for which locality-sensitive families are available. In this work we propose Parallel Voronoi LSH, an approach that addresses those two limitations of LSH: it makes LSH efficient for distributed-memory architectures, and it works for very general dissimilarities (in particular, it works for all metric dissimilarities). Each hash table of Voronoi LSH works by selecting a sample of the dataset to be used as seeds of a Voronoi diagram. The Voronoi cells are then used to hash the data. Because Voronoi diagrams depend only on the distance, the technique is very general. Implementing LSH in distributed-memory systems is very challenging because it lacks referential locality in its access to the data: if care is not taken, excessive message-passing ruins the index performance. Therefore, another important contribution of this work is the parallel design needed to allow the scalability of the index, which we evaluate in a dataset of a thousand million multimedia features.
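To make the hashing idea concrete, here is a minimal Python sketch of a single Voronoi LSH hash table — sample seeds, then bucket each point by its nearest seed. This is a toy illustration under assumed names and data (integers with the absolute-difference metric), not the paper’s actual implementation:

```python
import random

def voronoi_hash_table(dataset, dist, num_seeds, seed=42):
    """One hash table of (a sketch of) Voronoi LSH: sample seeds from the
    dataset, then bucket each point by its nearest seed (its Voronoi cell).
    Only the distance function is needed, so any metric works."""
    rng = random.Random(seed)
    seeds = rng.sample(dataset, num_seeds)
    buckets = {}
    for x in dataset:
        cell = min(range(num_seeds), key=lambda i: dist(x, seeds[i]))
        buckets.setdefault(cell, []).append(x)
    return seeds, buckets

def query(q, seeds, buckets, dist):
    """Candidates for q are the points that share q's Voronoi cell."""
    cell = min(range(len(seeds)), key=lambda i: dist(q, seeds[i]))
    return buckets.get(cell, [])

# Toy usage: integers under the absolute-difference metric
data = list(range(100))
metric = lambda a, b: abs(a - b)
seeds, buckets = voronoi_hash_table(data, metric, num_seeds=5)
candidates = query(3, seeds, buckets, metric)
```

In practice, several such tables with independent seed samples are combined, so that near neighbors missed by one table are caught by another.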

The full paper is available in the conference proceedings (LNCS 8821) and will be on open access from October 20 to November 21, 2014. The latest preprint is also available on my publications page.

We can’t tell you just yet…

(This entry is cross-posted from my lab’s blog.)

Anyone who’s ever worked at the frontier between Science and Innovation has faced the dilemma of secrecy versus disclosure: the scientific spirit demands full publication of every implementation detail — a result that cannot be reproduced is not a result — but when you are seeking intellectual property rights, you are often forced to withhold some details until you’ve got that patent.

We faced that quandary during our participation in MediaEval’s Violence Detection task: the Scientist in us wanted to just tell everything. But the research project that resulted in our participation in that competition is not just a scientific project; it is also about innovation, in partnership with Samsung Research Institute Brazil. As such, some details had to remain concealed, much to the frustration of everyone’s curiosity.

Fortunately, the task organizers took it in stride:


…that good-natured ribbing got everyone laughing at the task closing meeting!

We are sorry for the teasing, guys. We promise we will tell you everything soon… just not yet.

(Kudos to Mats and Martha for their good humor!)

Associate director of undergraduate studies

For the next few months I’ll be occupying the position of associate director of undergraduate studies of the Computer Engineering course, left vacant by Prof. Ivan Ricarte, who got his full professorship at another academic unit of UNICAMP. Currently, the director is Prof. Helio Pedrini of the Institute of Computing. Prof. Akebo Yamakami has kindly accepted to be my “vice-associate”, an informal position that exists because the direction is shared between two academic units. This is good news, because I’m a rookie in what concerns academic administration, while Prof. Yamakami has been involved in the direction of undergraduate studies since… forever. His experience will be invaluable.

I was appointed by the Electrical and Computer Engineering School steering committee in an indirect election, for a provisional mandate. Next June, the entire electoral college (faculty, staff, and students) will vote for the next director here at FEEC, and for the next associate director at the Institute of Computing, since the positions switch between the two units at the end of each mandate. (I know, I know — it’s complicated — but you get used to the idiosyncrasies of Brazilian public administration after a while…)

I thank my colleagues of the steering committee for their trust.


Talk at DCC, Universidad de Chile on Locality-Sensitive Hashing

My colleague Prof. Benjamin Bustos was kind enough to invite me for two weeks to collaborate with him and his students. In the context of that cooperation, I’ll be giving a talk at the Department of Computer Science, Universidad de Chile, on recent advances in Locality-Sensitive Hashing (LSH): “Advances on Locality-Sensitive Hashing for Large-Scale Indexing on General Metric Spaces”. Among other things, I’ll be talking about recent work of my group on the topic.

Here’s the abstract:

Performance (recall vs. time) of a few LSH techniques for general metric spaces

Locality-Sensitive Hashing (LSH), initially available only for Hamming, Jaccard, Manhattan, and Euclidean spaces, is now competitive for general metric spaces.

Locality-Sensitive Hashing is a family of techniques for similarity search that has gained much attention in the literature, both for its beautiful formalism and for its ability to perform well in systems where the cost of access to the data is uniform. However, traditional LSH poses the challenge of deducing a completely new family of locality-sensitive hash functions, unique for each distance function. Recently, researchers have proposed works that greatly extend the applicability of LSH, both by creating locality-sensitive functions that work for generic metric spaces, and by redesigning the algorithm to work in distributed-memory systems, whose cost of access to the data is not uniform (NUMA). In this talk, I’ll introduce the LSH formalism, and then focus on those recent advances.
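As a taste of the formalism, the classical bit-sampling family for Hamming space can be sketched in a few lines of Python — a toy illustration with assumed names, not the talk’s material:

```python
import random

def bit_sampling_family(num_bits, k, seed=0):
    """A classical locality-sensitive family for Hamming space: hash a 0/1
    vector by reading k randomly chosen bit positions.  Two vectors at
    Hamming distance d collide with probability (1 - d/num_bits) ** k,
    which decays with d -- that decay is what makes the family
    locality-sensitive."""
    rng = random.Random(seed)
    positions = [rng.randrange(num_bits) for _ in range(k)]
    def h(vector):
        return tuple(vector[i] for i in positions)
    return h

# Toy usage: one hash function drawn from the family
h = bit_sampling_family(num_bits=8, k=3, seed=1)
v = [1, 0, 1, 1, 0, 0, 1, 0]
```

Families like this exist only for a handful of spaces, which is exactly the limitation the metric-space extensions in the talk address.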

The talk will be given in English, while the discussions will be in both English and Spanish.

When: Thursday, September 11, at 15h00.

Where: Departamento de Ciencias de la Computación, Universidad de Chile, Av. Beauchef 851, Santiago, Chile, 837-0456.

Paper on Automated Melanoma Screening Accepted at SIBGRAPI

Our research on automated screening for melanoma was accepted for SIBGRAPI’2014, the Brazilian conference on Graphics, Patterns and Images, to be held in Rio de Janeiro next month.

Melanoma is the most dangerous type of skin cancer, responsible for the majority of deaths due to skin diseases. It is, on the other hand, one of the most curable forms of cancer when detected early enough. Because the prevalence of melanoma is increasing throughout the world, tools for automated screening — a test for whether or not the patient should seek a dermatologist — are a public health necessity. Automated screening is particularly important in poor, rural, or isolated communities with no resident dermatologist.

Extracts of skin lesions. Melanoma (left column) and benign skin lesions (right column) appear very similar, making the task of automated screening very challenging.


The paper, “Statistical Learning Approach for Robust Melanoma Screening”, advances the state of the art by employing a cutting-edge extension of the bags-of-words model called BossaNova. Here’s the abstract:

According to the American Cancer Society, one person dies of melanoma every 57 minutes, although it is the most curable type of cancer if detected early. Thus, computer-aided diagnosis for melanoma screening has been a topic of active research. Much of the existing art is based on the Bag-of-Visual-Words (BoVW) model, combined with color and texture descriptors. However, recent advances in the BoVW model, as well as the evaluation of the importance of the many different factors affecting the BoVW model were yet to be explored, thus motivating our work. We show that a new approach for melanoma screening, based upon the state-of-the-art BossaNova descriptors, shows very promising results for screening, reaching an AUC of up to 93.7%. An important contribution of this work is an evaluation of the factors that affect the performance of the two-layered BoVW model. Our results show that the low-level layer has a major impact on the accuracy of the model, but that the codebook size on the mid-level layer is also important. Those results may guide future works on melanoma screening.
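For readers unfamiliar with the two-layered model, here is a toy Python sketch of the classical hard-assignment BoVW encoding step that BossaNova refines (BossaNova uses a softer, distance-aware assignment; the code and names below are illustrative, not the paper’s implementation):

```python
def bovw_histogram(descriptors, codebook, dist):
    """Mid-level BoVW encoding (hard assignment): count how many low-level
    descriptors fall closest to each codeword, then L1-normalize the
    counts into a histogram that a classifier (e.g., an SVM) consumes."""
    counts = [0] * len(codebook)
    for d in descriptors:
        nearest = min(range(len(codebook)), key=lambda j: dist(d, codebook[j]))
        counts[nearest] += 1
    total = sum(counts) or 1
    return [c / total for c in counts]

# Toy usage: 1-D "descriptors" and a 2-word codebook
hist = bovw_histogram([0.0, 0.1, 0.9], [0.0, 1.0], lambda a, b: abs(a - b))
```

The codebook itself is typically learned by clustering (e.g., k-means) over a training set of low-level descriptors; its size is one of the mid-level factors the paper evaluates.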

The full text of the paper is already available on my publications page.

In addition, Michel Fornaciali has created a mini-site with extra information about the paper, including the executables for the method we implemented, and the AUC measure of all 320 runs employed in the statistical analysis.

The dataset employed was kindly provided by the researchers of the German project IRMA, hosted at RWTH Aachen University. We are working with them to make all the data publicly available.

Call for Contributions — Symposium of Signal Processing @ UNICAMP

The fifth edition of the University of Campinas Signal Processing Symposium (SPS-Unicamp) will take place this year on September 15th-17th.

This local symposium, promoted by the research community of São Paulo, is gaining importance as a dynamic, interactive event that offers young scientists the opportunity to network among themselves and with industrial partners.

The call for contributions is open. SPS-Unicamp welcomes paper and mini-course proposals in the following areas:

  • Biomedical engineering;
  • Image and video processing, visualization, and computer graphics;
  • Signal processing applied to forensics, biometrics, and bioinformatics;
  • Control and automation;
  • Seismic processing;
  • Communications;
  • Signal processing applied to sports science;
  • Theory of signal processing;
  • Hardware implementation of signal processing.

Papers can be written in either English or Portuguese. Both 4-page short papers and 1-page extended abstracts are accepted. Not only original works with results, but also works in progress and research-project papers are welcome.

Deadline: August 4th, 2014.

For more information, please check SPS-Unicamp Homepage.

The IEEE Women in Engineering South Brazil student chapter, hosted at Unicamp, and the IEEE Signal Processing Society São Paulo chapter support this event.

Diabetic Retinopathy Paper Accepted at IEEE EMBC’14

(This entry was crossposted with minor modifications from my lab’s blog.)

Our cooperative work on Diabetic Retinopathy has produced a new paper, now accepted at the IEEE Engineering in Medicine and Biology Conference! This new work explores the BossaNova representation — a state-of-the-art extension of the bags-of-words model — in the task of Diabetic Retinopathy classification.

Take a look at the abstract:

The biomedical community has shown a continued interest in automated detection of Diabetic Retinopathy (DR), with new imaging techniques, evolving diagnostic criteria, and advancing computing methods. Existing state of the art for detecting DR-related lesions tends to emphasize different, specific approaches for each type of lesion. However, recent research has aimed at general frameworks adaptable for large classes of lesions. In this paper, we follow this latter trend by exploring a very flexible framework, based upon two-tiered feature extraction (low-level and mid-level) from images and Support Vector Machines. The main contribution of this work is the evaluation of BossaNova, a recent and powerful mid-level image characterization technique, which we contrast with previous art based upon classical Bag of Visual Words (BoVW). The new technique using BossaNova achieves a detection performance (measured by area under the curve — AUC) of 96.4% for hard exudates, and 93.5% for red lesions using a cross-dataset training/testing protocol.
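The AUC reported above is equivalent to the Mann-Whitney statistic: the probability that a randomly chosen positive example outscores a randomly chosen negative one (ties counting half). A minimal illustrative sketch, not our evaluation code:

```python
def auc(pos_scores, neg_scores):
    """Area under the ROC curve via the Mann-Whitney statistic: the
    fraction of (positive, negative) pairs where the positive scores
    higher, counting ties as half a win."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))
```

A perfect detector yields 1.0, a coin-flip 0.5 — which is why the 96.4% and 93.5% figures above are read against that 50% baseline.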

ROC curves for hard exudates using class-based codebooks and comparing our previous approach with BoW [11] and our newly proposed technique with BossaNova. The AUCs are shown on the legend.


The full-text preprint is available on my publications page. The conference will be held in Chicago, IL, USA, from August 26 to 30, 2014.

Continuing our efforts to make our results reproducible, the datasets used in this work are publicly available at FigShare, under the DOI 10.6084/m9.figshare.953671. The code employed will be released soon.