[next,v2,4/5] support/scripts/pkg-stats-new: add latest upstream version information

Message ID 20180221221342.15683-5-thomas.petazzoni@bootlin.com
State Superseded
Series
  • New pkg-stats script, with version information

Commit Message

Thomas Petazzoni Feb. 21, 2018, 10:13 p.m.
This commit adds fetching the latest upstream version of each package
from release-monitoring.org.

The fetching process first tries to use the package mappings of the
"Buildroot" distribution [1]. If there is no result, then it does a
regular search, and within the search results, looks for a package
whose name matches the Buildroot name.

Since release-monitoring.org is a bit slow, we have 8 threads that
fetch information in parallel.

From an output point of view, the latest version column:

 - Is green when the version in Buildroot matches the latest upstream
   version

 - Is orange when the latest upstream version is unknown because the
   package was not found on release-monitoring.org

 - Is red when the version in Buildroot doesn't match the latest
   upstream version. Note that we are not doing anything smart here:
   we are just testing if the strings are equal or not.

 - The cell contains the link to the project on release-monitoring.org
   if found.

 - The cell indicates if the match was done using a distro mapping, or
   through a regular search.

[1] https://release-monitoring.org/distro/Buildroot/
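In pseudo-Python (Python 3, with illustrative names), the lookup order is
roughly the following, shown here operating on already-fetched JSON responses
rather than the live HTTP requests the script performs:

```python
# Sketch of the lookup order described above (names are illustrative,
# not the actual functions in pkg-stats-new).

def pick_latest_version(package, distro_response, search_response):
    """Return (mapping, version, id) for a Buildroot package name.

    distro_response: parsed JSON of /api/project/Buildroot/<package>,
                     or None when the distro has no mapping (HTTP 404).
    search_response: parsed JSON of a regular project search.
    """
    # First, prefer the "Buildroot" distro mapping.
    if distro_response is not None:
        versions = distro_response.get('versions', [])
        return (True, versions[0] if versions else None,
                distro_response['id'])
    # Otherwise, fall back to the search results, keeping only a
    # project whose name matches the Buildroot name exactly.
    for project in search_response.get('projects', []):
        if project['name'] == package and project['versions']:
            return (False, project['versions'][0], project['id'])
    return (False, None, None)
```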

Signed-off-by: Thomas Petazzoni <thomas.petazzoni@bootlin.com>
---
Changes since v1:
- Fix flake8 warnings
- Add missing newline in HTML
---
 support/scripts/pkg-stats-new | 142 ++++++++++++++++++++++++++++++++++++++++++
 1 file changed, 142 insertions(+)

Comments

Ricardo Martincoski Feb. 28, 2018, 3:03 a.m. | #1
Hello,

This is the last part of my review for v2, as patch 5 is just the rename.

When running the script with a few packages it works well.
But when running the script for all packages, it often hangs at the end for me
(2 out of 5 tries).
It could be related to my setup (Python module versions? internet connection?),
but anyway, see the alternative approach later in this email.

When it hangs, scrolling back I see this (in a different place for each run
that hung):
[snip]
 [1453] python-autobahn => (False, None, None)
Exception in thread Thread-6:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "support/scripts/pkg-stats-new", line 309, in get_version_worker
    pkg.latest_version = get_latest_version(pkg.name)
  File "support/scripts/pkg-stats-new", line 300, in get_latest_version
    return get_latest_version_by_guess(package)
  File "support/scripts/pkg-stats-new", line 282, in get_latest_version_by_guess
    f = urllib2.urlopen(req)
  File "/usr/lib/python2.7/urllib2.py", line 154, in urlopen
    return opener.open(url, data, timeout)
  File "/usr/lib/python2.7/urllib2.py", line 435, in open
    response = meth(req, response)
  File "/usr/lib/python2.7/urllib2.py", line 548, in http_response
    'http', request, response, code, msg, hdrs)
  File "/usr/lib/python2.7/urllib2.py", line 467, in error
    result = self._call_chain(*args)
  File "/usr/lib/python2.7/urllib2.py", line 407, in _call_chain
    result = func(*args)
  File "/usr/lib/python2.7/urllib2.py", line 654, in http_error_302
    return self.parent.open(new, timeout=req.timeout)
  File "/usr/lib/python2.7/urllib2.py", line 435, in open
    response = meth(req, response)
  File "/usr/lib/python2.7/urllib2.py", line 548, in http_response
    'http', request, response, code, msg, hdrs)
  File "/usr/lib/python2.7/urllib2.py", line 473, in error
    return self._call_chain(*args)
  File "/usr/lib/python2.7/urllib2.py", line 407, in _call_chain
    result = func(*args)
  File "/usr/lib/python2.7/urllib2.py", line 556, in http_error_default
    raise HTTPError(req.get_full_url(), code, msg, hdrs, fp)
HTTPError: HTTP Error 404: NOT FOUND

 [1452] cups-filters => (False, u'1.20.1', 5541)
[snip]
 [0001] python-flask-login => (False, None, None)
 [0000] luabitop => (False, None, None)
 [0000] python-pyopenssl => (False, None, None)
 [0000] libsexy => (False, None, None)
 [0000] libnetfilter_conntrack => (False, u'1.0.6', 1675)
 [0000] doxygen => (False, u'1.8.14.windows.x64.bin', 457)
 [0000] vdr-plugin-vnsiserver => (False, None, None)
[hangs here, I waited 10 minutes]


On Wed, Feb 21, 2018 at 07:13 PM, Thomas Petazzoni wrote:

> This commit adds fetching the latest upstream version of each package
> from release-monitoring.org.
> 
> The fetching process first tries to use the package mappings of the
> "Buildroot" distribution [1]. If there is no result, then it does a
> regular search, and within the search results, looks for a package
> whose name matches the Buildroot name.
> 
> Since release-monitoring.org is a bit slow, we have 8 threads that
> fetch information in parallel.

For me:
patches 1 to 3:   1m37.798s
patches 1 to 4:  20m5.513s

Alternative:      3m7.943s

[snip]
> +RELEASE_MONITORING_API = "http://release-monitoring.org/api"
> +
> +
> +def get_latest_version_by_distro(package):
> +    req = urllib2.Request(os.path.join(RELEASE_MONITORING_API, "project", "Buildroot", package))

There is also an undocumented API:
http://release-monitoring.org/api/projects/?distro=Buildroot
It was implemented here:
https://github.com/release-monitoring/anitya/commit/f3d8be75a6643b5d8c55754b74ed4f74f71fe952
and will eventually be documented here:
https://github.com/release-monitoring/anitya/issues/420

When the Buildroot distro becomes 90% filled we could get something like this:
$ wget -O Alpine 'http://release-monitoring.org/api/projects/?distro=Alpine'
... 1MB, 15 seconds ...
$ grep -c '"name"' Alpine
1838

> +    f = urllib2.urlopen(req)
> +    data = json.loads(f.read())
> +    if len(data['versions']) > 0:
> +        return (True, data['versions'][0], data['id'])

Also there is a field 'version' that appears to contain the latest version (or
None if there is none). For the majority of projects it is exactly equal to
'versions'[0].
See the test below; I didn't find any documentation to corroborate this.

$ wget http://release-monitoring.org/api/projects
... 7MB, 1 minute ...
$ cat check.py 
#!/usr/bin/env python
import json

with open("projects") as f:
    data = json.loads(f.read())
for p in data["projects"]:
    if p['version'] is None:
        continue
    if p['version'] == p['versions'][0]:
        continue
    print p['name'], p['version'], p['versions']
$ ./check.py | wc -l
109
$ grep -c '"name"' projects                                                                                        
16445

> +    else:
> +        return (True, None, data['id'])
[snip]
> +def add_latest_version_info(packages):
> +    """
> +    Fills in the .latest_version field of all Package objects
> +
> +    This field has a special format:
> +      (mapping, version, id)
> +    with:
> +    - mapping: boolean that indicates whether release-monitoring.org
> +      has a mapping for this package name in the Buildroot distribution
> +      or not
> +    - version: string containing the latest version known by
> +      release-monitoring.org for this package
> +    - id: string containing the id of the project corresponding to this
> +      package, as known by release-monitoring.org
> +    """

As an alternative approach we could first download 2 lists:
- all projects for the distro (now 10, eventually 2000+)
- all projects (now 16000+)
They could even be saved to a file by one option and then loaded by another
option, to speed up development/maintenance.

But that would require the projects in the Buildroot distro to be named
exactly like the Buildroot package, i.e. samba4, not samba. Or, of course,
implementing the search by similarity with regexps.

And be aware: it produces a few different guesses (some better, some worse).

Here is the sample code, without Queue/threading:
    cache_latest_version_by_distro()
    cache_latest_version_by_guess()

    for pkg in packages:
        pkg.latest_version = get_latest_version(pkg.name)
        print " [%04d] %s => %s" % (packages.index(pkg), pkg.name, str(pkg.latest_version))


distro_list = None
guess_list = None


def cache_latest_version_by_distro():
    global distro_list
    req = urllib2.Request(os.path.join(RELEASE_MONITORING_API, "projects", "?distro=Buildroot"))
    f = urllib2.urlopen(req)
    distro_list = json.loads(f.read())


def cache_latest_version_by_guess():
    global guess_list
    req = urllib2.Request(os.path.join(RELEASE_MONITORING_API, "projects"))
    f = urllib2.urlopen(req)
    guess_list = json.loads(f.read())


def get_latest_version(package):
    # We first try by using the "Buildroot" distribution on
    # release-monitoring.org, if it has a mapping for the current
    # package name.
    for data in distro_list["projects"]:
        if data["name"] != package:
            continue
        if len(data['versions']) > 0:
            return (True, data['versions'][0], data['id'])
        else:
            return (True, None, data['id'])
    # If that fails because there is no mapping, we try to search
    # in all packages for a package of this name.
    for p in guess_list['projects']:
        if p['name'] == package and len(p['versions']) > 0:
            return (False, p['versions'][0], p['id'])
    return (False, None, None)

> +    q = Queue()
> +    for pkg in packages:
> +        q.put(pkg)
> +    # Since release-monitoring.org is rather slow, we create 8 threads
> +    # that do HTTP requests to the site.
> +    for i in range(8):
> +        t = Thread(target=get_version_worker, args=[q])
> +        t.daemon = True
> +        t.start()
> +    q.join()
[snip]
> @@ -413,6 +514,34 @@ def dump_html_pkg(f, pkg):
>      # Current version
>      f.write("  <td class=\"centered\">%s</td>\n" % pkg.current_version)
>  
> +    # Latest version
> +    if pkg.latest_version[1] is None:
> +        td_class.append("version-unknown")
> +    elif pkg.latest_version[1] != pkg.current_version:
> +        td_class.append("version-needs-update")
> +    else:
> +        td_class.append("version-good")
> +
> +    if pkg.latest_version[1] is None:
> +        latest_version_text = "<b>Unknown</b>"
> +    else:
> +        latest_version_text = "<b>%s</b>" % str(pkg.latest_version[1])
> +
> +    latest_version_text += "<br/>"
> +
> +    if pkg.latest_version[2]:
> +        latest_version_text += "<a href=\"https://release-monitoring.org/project/%s\">link</a>, " % pkg.latest_version[2]
> +    else:
> +        latest_version_text += "no link, "
> +
> +    if pkg.latest_version[0]:
> +        latest_version_text += "has <a href=\"https://release-monitoring.org/distro/Buildroot/\">mapping</a>"
> +    else:
> +        latest_version_text += "has <a href=\"https://release-monitoring.org/distro/Buildroot/\">no mapping</a>"

If you don't think it gets too ugly, you could invert the text, putting
'mapping'/'no mapping' first in the cell. That should make sorting by this
column show all 'mapping' entries together and all 'no mapping' entries
together. I did not test this.

The other way (the correct way) is to implement custom sort keys
https://www.kryogenix.org/code/browser/sorttable/#customkeys
to be done later, of course.
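For reference, the custom-key variant could stay quite small; here is a sketch
(hypothetical helper, assuming sorttable's sorttable_customkey attribute) that
keeps the visible cell text unchanged while making the column sort by mapping
status:

```python
# Sketch: emit a sorttable_customkey attribute so the column sorts by
# mapping status while the visible cell text keeps its current order.
# The helper name and its integration into pkg-stats are hypothetical.

def latest_version_td(td_class, cell_text, has_mapping):
    sort_key = "mapping" if has_mapping else "no-mapping"
    return "  <td class=\"%s\" sorttable_customkey=\"%s\">%s</td>\n" % \
        (" ".join(td_class), sort_key, cell_text)
```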

> +
> +    f.write("  <td class=\"%s\">%s</td>\n" %
> +            (" ".join(td_class), latest_version_text))
> +
[snip]

Regards,
Ricardo
Thomas Petazzoni March 7, 2018, 10:41 p.m. | #2
Hello,

On Wed, 28 Feb 2018 00:03:53 -0300, Ricardo Martincoski wrote:

> When running the script with a few packages it works well.
> But when running the script for all packages, it often hangs at the end for me
> (2 out of 5 tries).
> It could be related to my setup (Python module versions? internet connection?),
> but anyway, see the alternative approach later in this email.

This could probably be solved by adding timeouts, but I find your
solution below interesting and useful.

> There is also an undocumented API:
> http://release-monitoring.org/api/projects/?distro=Buildroot
> It was implemented here:
> https://github.com/release-monitoring/anitya/commit/f3d8be75a6643b5d8c55754b74ed4f74f71fe952
> and will eventually be documented here:
> https://github.com/release-monitoring/anitya/issues/420
> 
> When the Buildroot distro becomes 90% filled we could get something like this:
> $ wget -O Alpine 'http://release-monitoring.org/api/projects/?distro=Alpine'
> ... 1MB, 15 seconds ...
> $ grep -c '"name"' Alpine
> 1838

Right, but it's going to take a while before we reach this :-)

> Also there is a field 'version' that appears to contain the latest version (or
> None if there is none). For the majority of projects it is exactly equal to
> 'versions'[0].
> See the test below; I didn't find any documentation to corroborate this.

We can probably talk with the maintainers of release-monitoring.org
about this.

> As an alternative approach we could first download 2 lists:
> - all projects for distro (now 10, eventually 2000+)
> - all projects (now 16000+)
> They could even be saved to a file by one option and then loaded by another
> option, to speed up development/maintenance.

Yes, I'll do something like this: download both files to some
~/.release-monitoring-all-packages.cache and
~/.release-monitoring-buildroot-packages.cache, and use them if they
already exist. Then an option such as -r/--reload can force pkg-stats
to re-download the new version.
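That could look roughly like this (a sketch with a pluggable download
callable; the cache file names and option handling are assumptions, not the
actual v3 code):

```python
import json
import os

def load_or_download(cache_file, download, reload_cache=False):
    """Return parsed JSON from cache_file, downloading only when the
    cache is absent or a -r/--reload style option was given.
    `download` is any callable returning the raw JSON text."""
    if reload_cache or not os.path.exists(cache_file):
        with open(cache_file, 'w') as f:
            f.write(download())
    with open(cache_file) as f:
        return json.loads(f.read())
```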

> But that would require the projects in the Buildroot distro to be named
> exactly like the Buildroot package, i.e. samba4, not samba.

Well, that is the whole point of "distros" on release-monitoring.org:
provide a mapping between the package name in the distribution and the
package name on release-monitoring.org.

The packages present in the "Buildroot" distro on
release-monitoring.org were added by me, often to fix a mismatch
between the Buildroot name and the release-monitoring.org name. samba4
(Buildroot) vs. samba (release-monitoring.org) is a good example.

> Or, of course, implementing the search by
> similarity with regexps.
> 
> And be aware: it produces a few different guesses (some better, some worse).

I didn't understand this part. Why would the results be different ?

> > +    if pkg.latest_version[0]:
> > +        latest_version_text += "has <a href=\"https://release-monitoring.org/distro/Buildroot/\">mapping</a>"
> > +    else:
> > +        latest_version_text += "has <a href=\"https://release-monitoring.org/distro/Buildroot/\">no mapping</a>"  
> 
> If you don't think it gets too ugly, you could invert the text, putting
> 'mapping'/'no mapping' as the first cell text. It should make sorting by this
> column to show all 'mapping' together and 'no mapping' together. I did not
> tested this.

I understand the point, but I find it a bit ugly to have this information
appear first in the cell. One option is to make this a separate column, but we
already have a lot of columns. Can we handle that later on?

Thanks!

Thomas
Ricardo Martincoski March 8, 2018, 9:52 a.m. | #3
Hello,

On Wed, Mar 07, 2018 at 07:41 PM, Thomas Petazzoni wrote:

> On Wed, 28 Feb 2018 00:03:53 -0300, Ricardo Martincoski wrote:
> 
[snip]
>> As an alternative approach we could first download 2 lists:
>> - all projects for distro (now 10, eventually 2000+)
>> - all projects (now 16000+)
>> They could even be saved to a file by one option and then loaded by another
>> option, to speed up development/maintenance.
> 
> Yes, I'll do something like this: download both files to some
> ~/.release-monitoring-all-packages.cache and
> ~/.release-monitoring-buildroot-packages.cache, and use them if they
> already exist. Then an option such as -r/--reload can force pkg-stats
> to re-download the new version.

Sounds good.

[snip]
>> And be aware: it produces a few different guesses (some better, some worse).
> 
> I didn't understand this part. Why would the results be different ?

Indeed, it shouldn't.

I am not sure why it does. It only happens for packages that have more than
one project with the exact same name, so one guess gets chosen over another.

You can check using this:

import json
import os
import subprocess
def print_id(list):
    for i in list['projects']:
        if i['name'] == 'readline':
            print(i['id'])
API = "http://release-monitoring.org/api"
url1 = os.path.join(API, "projects", "?pattern=%s" % 'readline')
url2 = os.path.join(API, "projects")
subprocess.call(['wget', '-O', 'readline', url1])
subprocess.call(['wget', '-O', 'projects', url2])
print_id(json.load(open('readline')))
print_id(json.load(open('projects')))

Most of the time (3 out of 4) it gives me:
913
4173
4173
913
and sometimes (1 out of 4) it gives me:
913
4173
913
4173
It seems the API does not always return the same order. If I run the above
script a few times in a row the order is consistent; if I wait a few hours the
result can change.
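If we wanted the guess to be independent of the order the API returns, a
deterministic tie-break would do it, for instance picking the lowest project
id among exact name matches (just a sketch of the idea, not something either
script implements today):

```python
def pick_project(projects, name):
    """Among exact name matches, return the project with the lowest id,
    so the choice does not depend on the API's (unstable) ordering."""
    matches = [p for p in projects if p['name'] == name]
    return min(matches, key=lambda p: p['id']) if matches else None
```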

> 
>> > +    if pkg.latest_version[0]:
>> > +        latest_version_text += "has <a href=\"https://release-monitoring.org/distro/Buildroot/\">mapping</a>"
>> > +    else:
>> > +        latest_version_text += "has <a href=\"https://release-monitoring.org/distro/Buildroot/\">no mapping</a>"  
>> 
>> If you don't think it gets too ugly, you could invert the text, putting
>> 'mapping'/'no mapping' first in the cell. That should make sorting by this
>> column show all 'mapping' entries together and all 'no mapping' entries
>> together. I did not test this.
> 
> I understand the point, but I find it a bit ugly to have this information
> appear first in the cell. One option is to make this a separate column, but
> we already have a lot of columns. Can we handle that later on?

Sure.

Regards,
Ricardo
Thomas Petazzoni March 8, 2018, 9:56 a.m. | #4
Hello,

On Thu, 08 Mar 2018 06:52:49 -0300, Ricardo Martincoski wrote:

> > I didn't understand this part. Why would the results be different ?  
> 
> Indeed, it shouldn't.
> 
> I am not sure why it does. It only happens for packages that have more than
> one project with the exact same name, so one guess gets chosen over another.
> 
> You can check using this:
> 
> import json
> import os
> import subprocess
> def print_id(list):
>     for i in list['projects']:
>         if i['name'] == 'readline':
>             print(i['id'])
> API = "http://release-monitoring.org/api"
> url1 = os.path.join(API, "projects", "?pattern=%s" % 'readline')
> url2 = os.path.join(API, "projects")
> subprocess.call(['wget', '-O', 'readline', url1])
> subprocess.call(['wget', '-O', 'projects', url2])
> print_id(json.load(open('readline')))
> print_id(json.load(open('projects')))
> 
> Most of the time (3 out of 4) it gives me:
> 913
> 4173
> 4173
> 913
> and sometimes (1 out of 4) it gives me:
> 913
> 4173
> 913
> 4173
> It seems the API does not always return the same order. If I run the above
> script a few times in a row the order is consistent; if I wait a few hours
> the result can change.

Perhaps we should report this upstream to the people maintaining
release-monitoring.org?

Thomas
Ricardo Martincoski March 9, 2018, 2:41 a.m. | #5
Hello,

On Thu, Mar 08, 2018 at 06:56 AM, Thomas Petazzoni wrote:

[snip]
>> It seems the API does not always return the same order. If I run the above
>> script a few times in a row the order is consistent; if I wait a few hours
>> the result can change.
> 
> Perhaps we should report this upstream to the people maintaining
> release-monitoring.org?

I created:
https://github.com/release-monitoring/anitya/issues/533


Also, regarding the 'version' field, I found these lines of code:

class Project(Base):
...
        latest_version (sa.Boolean): The latest version for the project, as determined
            by the version sorting algorithm.
...
    def __json__(self, detailed=False):
...
            version=self.latest_version,

https://github.com/release-monitoring/anitya/blob/a0a9343dd86d9b7d41487b241f8e102d3321deee/anitya/db/models.py#L415
(PS: it's not a boolean, of course; the docstring is wrong on this point)


Regards,
Ricardo
Thomas Petazzoni March 21, 2018, 8:58 p.m. | #6
Hello Ricardo,

On Wed, 28 Feb 2018 00:03:53 -0300, Ricardo Martincoski wrote:

> But that would require the projects in the Buildroot distro to be named
> exactly like the Buildroot package, i.e. samba4, not samba. Or, of course,
> implementing the search by similarity with regexps.
> 
> And be aware: it produces a few different guesses (some better, some worse).
> 
> Here is the sample code, without Queue/threading:
>     cache_latest_version_by_distro()
>     cache_latest_version_by_guess()
> 
>     for pkg in packages:
>         pkg.latest_version = get_latest_version(pkg.name)
>         print " [%04d] %s => %s" % (packages.index(pkg), pkg.name, str(pkg.latest_version))
> 
> 
> distro_list = None
> guess_list = None
> 
> 
> def cache_latest_version_by_distro():
>     global distro_list
>     req = urllib2.Request(os.path.join(RELEASE_MONITORING_API, "projects", "?distro=Buildroot"))
>     f = urllib2.urlopen(req)
>     distro_list = json.loads(f.read())
> 
> 
> def cache_latest_version_by_guess():
>     global guess_list
>     req = urllib2.Request(os.path.join(RELEASE_MONITORING_API, "projects"))
>     f = urllib2.urlopen(req)
>     guess_list = json.loads(f.read())
> 
> 
> def get_latest_version(package):
>     # We first try by using the "Buildroot" distribution on
>     # release-monitoring.org, if it has a mapping for the current
>     # package name.
>     for data in distro_list["projects"]:
>         if data["name"] != package:
>             continue
>         if len(data['versions']) > 0:
>             return (True, data['versions'][0], data['id'])
>         else:
>             return (True, None, data['id'])
>     # If that fails because there is no mapping, we try to search
>     # in all packages for a package of this name.
>     for p in guess_list['projects']:
>         if p['name'] == package and len(p['versions']) > 0:
>             return (False, p['versions'][0], p['id'])
>     return (False, None, None)

I looked more closely into this, and unfortunately, it doesn't work.
Indeed, the data returned by the URL:

  RELEASE_MONITORING_API/projects/?distro=Buildroot

does not take into account the mapping between Buildroot package names
and release-monitoring.org package names, it only contains the
release-monitoring.org package names.

For example, we have a mapping samba4 -> samba, that allows
https://release-monitoring.org/api/project/Buildroot/samba4 to return
the expected result. But you won't find anything named "samba4" in
https://release-monitoring.org/api/projects/?distro=Buildroot. This
makes the entire distro mapping concept useless through that endpoint.

Therefore, your proposal cannot work with the data that we can
currently download from release-monitoring.org. I have asked on the
#fedora-apps IRC channel about this.

Perhaps we could go with my v2 version on this aspect (i.e. make HTTP
requests for each package), and change that later on if a better
solution is found?

Best regards,

Thomas
Thomas Petazzoni March 21, 2018, 9:35 p.m. | #7
Hello,

On Wed, 28 Feb 2018 00:03:53 -0300, Ricardo Martincoski wrote:

> When it hangs, scrolling back I see this (in a different place for each run
> that hung):
> [snip]
>  [1453] python-autobahn => (False, None, None)
> Exception in thread Thread-6:
> Traceback (most recent call last):
>   File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
>     self.run()
>   File "/usr/lib/python2.7/threading.py", line 754, in run
>     self.__target(*self.__args, **self.__kwargs)
>   File "support/scripts/pkg-stats-new", line 309, in get_version_worker
>     pkg.latest_version = get_latest_version(pkg.name)
>   File "support/scripts/pkg-stats-new", line 300, in get_latest_version
>     return get_latest_version_by_guess(package)
>   File "support/scripts/pkg-stats-new", line 282, in get_latest_version_by_guess
>     f = urllib2.urlopen(req)
>   File "/usr/lib/python2.7/urllib2.py", line 154, in urlopen
>     return opener.open(url, data, timeout)
>   File "/usr/lib/python2.7/urllib2.py", line 435, in open
>     response = meth(req, response)
>   File "/usr/lib/python2.7/urllib2.py", line 548, in http_response
>     'http', request, response, code, msg, hdrs)
>   File "/usr/lib/python2.7/urllib2.py", line 467, in error
>     result = self._call_chain(*args)
>   File "/usr/lib/python2.7/urllib2.py", line 407, in _call_chain
>     result = func(*args)
>   File "/usr/lib/python2.7/urllib2.py", line 654, in http_error_302
>     return self.parent.open(new, timeout=req.timeout)
>   File "/usr/lib/python2.7/urllib2.py", line 435, in open
>     response = meth(req, response)
>   File "/usr/lib/python2.7/urllib2.py", line 548, in http_response
>     'http', request, response, code, msg, hdrs)
>   File "/usr/lib/python2.7/urllib2.py", line 473, in error
>     return self._call_chain(*args)
>   File "/usr/lib/python2.7/urllib2.py", line 407, in _call_chain
>     result = func(*args)
>   File "/usr/lib/python2.7/urllib2.py", line 556, in http_error_default
>     raise HTTPError(req.get_full_url(), code, msg, hdrs, fp)
> HTTPError: HTTP Error 404: NOT FOUND
> 
>  [1452] cups-filters => (False, u'1.20.1', 5541)
> [snip]
>  [0001] python-flask-login => (False, None, None)
>  [0000] luabitop => (False, None, None)
>  [0000] python-pyopenssl => (False, None, None)
>  [0000] libsexy => (False, None, None)
>  [0000] libnetfilter_conntrack => (False, u'1.0.6', 1675)
>  [0000] doxygen => (False, u'1.8.14.windows.x64.bin', 457)
>  [0000] vdr-plugin-vnsiserver => (False, None, None)
> [hangs here, I waited 10 minutes]

This should be fixed now; the fix will be in v3. I'm now handling exceptions
in all cases, and I've added a timeout on the urllib2.urlopen() calls, which
makes them abort after 15 seconds if the HTTP request has not returned.
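The shape of that fix might be roughly as follows (a sketch in Python 3's
urllib terms, with a pluggable fetch callable standing in for the real
urllib2-based helpers; the actual v3 code may differ):

```python
import socket
import urllib.error  # urllib2 in the script's Python 2

FETCH_TIMEOUT = 15  # seconds, matching the timeout mentioned above

def safe_get_latest_version(package, fetch):
    """Call fetch(package, timeout=FETCH_TIMEOUT), turning an HTTP error
    or a timeout into the 'unknown' result instead of killing the
    worker thread with an unhandled exception."""
    try:
        return fetch(package, timeout=FETCH_TIMEOUT)
    except (urllib.error.HTTPError, urllib.error.URLError, socket.timeout):
        return (False, None, None)
```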

This will make sure the script terminates properly. However, it means that
the result of the script may differ from one run to the next, as the HTTP
request for a given package may sometimes take more than 15 seconds,
sometimes not.

I guess this is a good enough trade-off, until upstream provides us a
better way of retrieving the data.

Best regards,

Thomas
Ricardo Martincoski March 22, 2018, 3:11 a.m. | #8
Hello,

On Wed, Mar 21, 2018 at 05:58 PM, Thomas Petazzoni wrote:

> I looked more closely into this, and unfortunately, it doesn't work.
> Indeed, the data returned by the URL:
> 
>   RELEASE_MONITORING_API/projects/?distro=Buildroot
> 
> does not take into account the mapping between Buildroot package names
> and release-monitoring.org package names, it only contains the
> release-monitoring.org package names.

Indeed, I missed that the mapping was missing.

> 
> For example, we have a mapping samba4 -> samba, that allows
> https://release-monitoring.org/api/project/Buildroot/samba4 to return
> the expected result. But you won't find anything named "samba4" in
> https://release-monitoring.org/api/projects/?distro=Buildroot. This
> makes the entire concept of distro mapping useless.
> 
> Therefore, your proposal cannot work with the data that we can
> currently download from release-monitoring.org. I have asked on the
> #fedora-apps IRC channel about this.
> 
> Perhaps we could go with my v2 version on this aspect (i.e. make HTTP
> requests for each package), and change that later on if a better
> solution is found?

I dislike the fact that we make 2000+ HTTP requests, but as you explained it
is a limitation of the current API exposed by upstream. I even looked at the
code hoping the mapping could be easily introduced in API v1... but no, it is
not trivial.

So, yes. Let's do this incrementally: first make it work using one request per
package, then improve the script later once API v2 is deployed.

In the meantime, we will be able to start creating the mappings based on the
daily generated HTML.

Regards,
Ricardo
Ricardo Martincoski March 22, 2018, 3:17 a.m. | #9
Hello,

On Wed, Mar 21, 2018 at 06:35 PM, Thomas Petazzoni wrote:

> On Wed, 28 Feb 2018 00:03:53 -0300, Ricardo Martincoski wrote:
> 
>>  [0000] vdr-plugin-vnsiserver => (False, None, None)
>> [hangs here, I waited 10 minutes]
> 
> This should be fixed now; the fix will be in v3. I'm now handling
> exceptions in all cases, and I've added a timeout on the urllib2.urlopen()
> calls, which makes them abort after 15 seconds if the HTTP request has not
> returned.
> 
> This will make sure the script terminates properly. However, it means that
> the result of the script may differ from one run to the next, as the HTTP
> request for a given package may sometimes take more than 15 seconds,
> sometimes not.

Do you mean the generated HTML will be different?
Will the script retry for a package that times out?
Well... I can wait for you to send v3 to find out.

> 
> I guess this is a good enough trade-off, until upstream provides us a
> better way of retrieving the data.

Upstream is working on API v2, which will improve various aspects.
The main changes in the API are a new ecosystem field (it will contain pypi
for Python packages, the upstream URL for custom projects, ...) and the paging
system in the web interface.
The new api will also provide the mapping:
url/api/v2/projects?distribution=Buildroot
I tested this by running a local server with a copy of the production database.
Sample output:
...
{
    "distribution": "Buildroot", 
    "ecosystem": "https://download.samba.org/pub/samba/", 
    "name": "samba4", 
    "project": "samba"
}, 
...
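Consuming that output would then be trivial, e.g. building a Buildroot-name to
upstream-project mapping (a sketch against the sample item above; the
surrounding pagination fields are omitted):

```python
def build_distro_mapping(items):
    """Map each distribution package name to its release-monitoring.org
    project name, given items from api/v2/projects?distribution=..."""
    return {item['name']: item['project'] for item in items}
```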

Most changes in the code look (to me) ready, and upstream is currently
setting up a staging server.
API v2 is *not yet* deployed on release-monitoring.org. I don't know the
upstream timeline for this.

In the meantime, the solution you propose seems to be the best we can do using
API v1.

Right now there is a single direct user of the script (the server that
generates the HTML), so a time penalty is not critical.
All the other users (us) will access the already generated pkg-stats HTML.


But I think it is better to upgrade to API v2 before we provide this script
as a build target to generate a report based on the packages selected by
.config.
I think you mentioned something similar in an e-mail I unfortunately can't
find right now.

Regards,
Ricardo
Thomas Petazzoni March 22, 2018, 7:53 a.m. | #10
Hello Ricardo,

On Thu, 22 Mar 2018 00:11:09 -0300, Ricardo Martincoski wrote:

> I dislike the fact that we make 2000+ HTTP requests, but as you explained it
> is a limitation of the current API exposed by upstream. I even looked at the
> code hoping the mapping could be easily introduced in API v1... but no, it is
> not trivial.

I think it is easy. It should just be a matter of the following patch:

diff --git a/anitya/api.py b/anitya/api.py
index a53be43..5f0be9e 100644
--- a/anitya/api.py
+++ b/anitya/api.py
@@ -156,7 +156,7 @@ def api_projects():
     else:
         project_objs = models.Project.all(Session)
 
-    projects = [project.__json__() for project in project_objs]
+    projects = [project.__json__(detailed=True) for project in project_objs]
 
     output = {
         'total': len(projects),

I've asked upstream if they were interested, and the reply I got this
morning is:

08:48 < pingou> kos_tom: sounds fine to me, but I know that jcline has been working on a v2 api, so maybe that'll help
08:49 < kos_tom> pingou: do you think you can apply the patch I provided ?
08:49 < kos_tom> pingou: alternatively, we could make getting the detailed information optional
08:49 < pingou> kos_tom: jcline is now the one maintaining it
08:52 < kos_tom> pingou: ok :)

> So, yes. Let's do this incrementally: first make it work using one request
> per package, then improve the script later once API v2 is deployed.

Right.

> In the meantime, we will be able to start creating the mappings based on the
> daily generated HTML.

Indeed.

Best regards,

Thomas
Thomas Petazzoni March 22, 2018, 10:01 a.m. | #11
Hello,

On Thu, 22 Mar 2018 00:17:26 -0300, Ricardo Martincoski wrote:

> > This will allow to make sure the script terminates properly. However,
> > it means that the result of the script may be different from one run to
> > the other, as the HTTP request for a given package may sometimes take
> > more than 15 seconds, sometimes not.  
> 
> Do you mean the generated html will be different?

Yes, the HTML would be different: if on a first run the request for
package "foo" times out, the HTML will say "unknown upstream version".
If on a second run of the script the request for package "foo"
doesn't time out (release-monitoring.org was faster), then the HTML
will have a proper value for the upstream version.

> Will the script retry for a package that times out?

So far it doesn't.
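If retries were wanted, one way to add them (not part of the patch; a
hedged sketch with made-up retry counts and delays) would be a small
retry wrapper around the request. The `fetch` callable would typically
wrap `urllib2.urlopen(req, timeout=15)`; here it is kept generic so the
policy can be exercised without network access:

```python
import time

def with_retries(fetch, retries=3, delay=1.0):
    """Call fetch(); on failure, retry up to `retries` attempts total.

    The last failure is re-raised so the caller can still record the
    package as "unknown upstream version".
    """
    for attempt in range(retries):
        try:
            return fetch()
        except Exception:
            if attempt == retries - 1:
                raise
            time.sleep(delay)

# Simulate a request that times out twice, then succeeds.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise IOError("timed out")
    return "1.2.11"

print(with_retries(flaky, retries=3, delay=0))  # prints 1.2.11
```

This would make the generated HTML more stable across runs, at the cost
of a longer total runtime when release-monitoring.org is slow.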

> The upstream is working to provide api_v2 that will improve various aspects.
> The main changes in the api are a new field ecosystem (it will contain pypi for
> python packages, upstream url for custom projects, ...) and the paging system in
> the web interface.
> The new api will also provide the mapping:
> url/api/v2/projects?distribution=Buildroot
> I tested this by running a local server with a copy of the production database.
> Sample output:
> ...
> {
>     "distribution": "Buildroot", 
>     "ecosystem": "https://download.samba.org/pub/samba/", 
>     "name": "samba4", 
>     "project": "samba"
> }, 
> ...
> 
> Most changes in the code look (to me) ready and the upstream is currently
> setting up a staging server.
> Api v2 is *not yet* deployed to release-monitoring. I don't know the upstream
> timeline for this.
> 
> In the meanwhile, the solution you propose seems to be the best we can do using
> api v1.
> 
> Right now there is a single direct user of the script (the server that
> generates the html), so a time penalty is not critical.
> All the other users (we) will access the already generated pkg-stats html.

Yes, that was the idea.
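Once the v2 endpoint is deployed, the whole Buildroot mapping could be
consumed in one request. A hedged sketch of the client side, assuming
the item shape matches the sample quoted above (the v2 API is not yet
deployed, so field names are assumptions):

```python
import json

# Item shape taken from the sample output quoted above for
# url/api/v2/projects?distribution=Buildroot.
SAMPLE_ITEMS = json.loads("""
[
  {"distribution": "Buildroot",
   "ecosystem": "https://download.samba.org/pub/samba/",
   "name": "samba4",
   "project": "samba"}
]
""")

def buildroot_mapping(items):
    """Map Buildroot package name -> upstream project name."""
    return {i["name"]: i["project"]
            for i in items if i["distribution"] == "Buildroot"}

print(buildroot_mapping(SAMPLE_ITEMS))  # -> {'samba4': 'samba'}
```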

> But I think it is better to upgrade to api v2 before we provide this script as a
> build target to generate a report based on the packages selected by .config .
> I think you mentioned something similar in an e-mail I unfortunately can't find
> right now.

Indeed, when the v2 API is available, we will definitely revisit this
and use pre-downloaded data.

Best regards,

Thomas

Patch

diff --git a/support/scripts/pkg-stats-new b/support/scripts/pkg-stats-new
index c4174877aa..b27f4df9a8 100755
--- a/support/scripts/pkg-stats-new
+++ b/support/scripts/pkg-stats-new
@@ -24,6 +24,10 @@  from collections import defaultdict
 import re
 import subprocess
 import sys
+import json
+import urllib2
+from Queue import Queue
+from threading import Thread
 
 
 class Package:
@@ -37,6 +41,7 @@  class Package:
         self.patch_count = 0
         self.warnings = 0
         self.current_version = None
+        self.latest_version = None
 
     def __eq__(self, other):
         return self.path == other.path
@@ -259,6 +264,83 @@  def add_check_package_warnings(packages):
         pkg.warnings = get_check_package_warnings(os.path.dirname(pkg.path))
 
 
+RELEASE_MONITORING_API = "http://release-monitoring.org/api"
+
+
+def get_latest_version_by_distro(package):
+    req = urllib2.Request(os.path.join(RELEASE_MONITORING_API, "project", "Buildroot", package))
+    f = urllib2.urlopen(req)
+    data = json.loads(f.read())
+    if len(data['versions']) > 0:
+        return (True, data['versions'][0], data['id'])
+    else:
+        return (True, None, data['id'])
+
+
+def get_latest_version_by_guess(package):
+    req = urllib2.Request(os.path.join(RELEASE_MONITORING_API, "projects", "?pattern=%s" % package))
+    f = urllib2.urlopen(req)
+    data = json.loads(f.read())
+    for p in data['projects']:
+        if p['name'] == package and len(p['versions']) > 0:
+            return (False, p['versions'][0], p['id'])
+    return (False, None, None)
+
+
+def get_latest_version(package):
+    try:
+        # We first try by using the "Buildroot" distribution on
+        # release-monitoring.org, if it has a mapping for the current
+        # package name.
+        return get_latest_version_by_distro(package)
+    except urllib2.HTTPError, e:
+        # If that fails because there is no mapping, we try to search
+        # in all packages for a package of this name.
+        if e.code == 404:
+            return get_latest_version_by_guess(package)
+        else:
+            return (False, None, None)
+
+
+def get_version_worker(q):
+    while True:
+        pkg = q.get()
+        try:
+            pkg.latest_version = get_latest_version(pkg.name)
+            print " [%04d] %s => %s" % (q.qsize(), pkg.name, str(pkg.latest_version))
+        except ValueError:
+            pkg.latest_version = (False, None, None)
+            print " [%04d] %s => Value Error" % (q.qsize(), pkg.name)
+        q.task_done()
+
+
+def add_latest_version_info(packages):
+    """
+    Fills in the .latest_version field of all Package objects
+
+    This field has a special format:
+      (mapping, version, id)
+    with:
+    - mapping: boolean that indicates whether release-monitoring.org
+      has a mapping for this package name in the Buildroot distribution
+      or not
+    - version: string containing the latest version known by
+      release-monitoring.org for this package
+    - id: string containing the id of the project corresponding to this
+      package, as known by release-monitoring.org
+    """
+    q = Queue()
+    for pkg in packages:
+        q.put(pkg)
+    # Since release-monitoring.org is rather slow, we create 8 threads
+    # that do HTTP requests to the site.
+    for i in range(8):
+        t = Thread(target=get_version_worker, args=[q])
+        t.daemon = True
+        t.start()
+    q.join()
+
+
 def calculate_stats(packages):
     stats = defaultdict(int)
     for pkg in packages:
@@ -283,6 +365,16 @@  def calculate_stats(packages):
             stats["hash"] += 1
         else:
             stats["no-hash"] += 1
+        if pkg.latest_version[0]:
+            stats["rmo-mapping"] += 1
+        else:
+            stats["rmo-no-mapping"] += 1
+        if not pkg.latest_version[1]:
+            stats["version-unknown"] += 1
+        elif pkg.latest_version[1] == pkg.current_version:
+            stats["version-uptodate"] += 1
+        else:
+            stats["version-not-uptodate"] += 1
         stats["patches"] += pkg.patch_count
     return stats
 
@@ -315,6 +407,15 @@  td.somepatches {
 td.lotsofpatches {
   background: #ff9a69;
 }
+td.version-good {
+  background: #d2ffc4;
+}
+td.version-needs-update {
+  background: #ff9a69;
+}
+td.version-unknown {
+ background: #ffd870;
+}
 </style>
 <title>Statistics of Buildroot packages</title>
 </head>
@@ -413,6 +514,34 @@  def dump_html_pkg(f, pkg):
     # Current version
     f.write("  <td class=\"centered\">%s</td>\n" % pkg.current_version)
 
+    # Latest version
+    if pkg.latest_version[1] is None:
+        td_class.append("version-unknown")
+    elif pkg.latest_version[1] != pkg.current_version:
+        td_class.append("version-needs-update")
+    else:
+        td_class.append("version-good")
+
+    if pkg.latest_version[1] is None:
+        latest_version_text = "<b>Unknown</b>"
+    else:
+        latest_version_text = "<b>%s</b>" % str(pkg.latest_version[1])
+
+    latest_version_text += "<br/>"
+
+    if pkg.latest_version[2]:
+        latest_version_text += "<a href=\"https://release-monitoring.org/project/%s\">link</a>, " % pkg.latest_version[2]
+    else:
+        latest_version_text += "no link, "
+
+    if pkg.latest_version[0]:
+        latest_version_text += "has <a href=\"https://release-monitoring.org/distro/Buildroot/\">mapping</a>"
+    else:
+        latest_version_text += "has <a href=\"https://release-monitoring.org/distro/Buildroot/\">no mapping</a>"
+
+    f.write("  <td class=\"%s\">%s</td>\n" %
+            (" ".join(td_class), latest_version_text))
+
     # Warnings
     td_class = ["centered"]
     if pkg.warnings == 0:
@@ -436,6 +565,7 @@  def dump_html_all_pkgs(f, packages):
 <td class=\"centered\">License files</td>
 <td class=\"centered\">Hash file</td>
 <td class=\"centered\">Current version</td>
+<td class=\"centered\">Latest version</td>
 <td class=\"centered\">Warnings</td>
 </tr>
 """)
@@ -465,6 +595,16 @@  def dump_html_stats(f, stats):
             stats["no-hash"])
     f.write(" <tr><td>Total number of patches</td><td>%s</td></tr>\n" %
             stats["patches"])
+    f.write("<tr><td>Packages having a mapping on <i>release-monitoring.org</i></td><td>%s</td></tr>\n" %
+            stats["rmo-mapping"])
+    f.write("<tr><td>Packages lacking a mapping on <i>release-monitoring.org</i></td><td>%s</td></tr>\n" %
+            stats["rmo-no-mapping"])
+    f.write("<tr><td>Packages that are up-to-date</td><td>%s</td></tr>\n" %
+            stats["version-uptodate"])
+    f.write("<tr><td>Packages that are not up-to-date</td><td>%s</td></tr>\n" %
+            stats["version-not-uptodate"])
+    f.write("<tr><td>Packages with no known upstream version</td><td>%s</td></tr>\n" %
+            stats["version-unknown"])
     f.write("</table>\n")
 
 
@@ -517,6 +657,8 @@  def __main__():
     add_patch_count(packages)
     print "Get package warnings ..."
     add_check_package_warnings(packages)
+    print "Get latest version ..."
+    add_latest_version_info(packages)
     print "Calculate stats"
     stats = calculate_stats(packages)
     print "Write HTML"
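For reference, the worker/queue pattern used by add_latest_version_info
in the patch reduces to the following self-contained sketch (Python 3
here, while the patch targets Python 2; fetch_version is a stand-in for
the real HTTP lookup on release-monitoring.org):

```python
from queue import Queue
from threading import Thread

def fetch_version(name):
    # Stand-in for the real HTTP lookup; returns the same
    # (mapping, version, id) tuple format as the patch.
    return (False, "1.0-" + name, None)

def worker(q, results):
    while True:
        name = q.get()
        try:
            results[name] = fetch_version(name)
        except Exception:
            results[name] = (False, None, None)
        q.task_done()

def fetch_all(names, nthreads=8):
    q, results = Queue(), {}
    for n in names:
        q.put(n)
    for _ in range(nthreads):
        t = Thread(target=worker, args=(q, results))
        t.daemon = True   # let the program exit even if a worker blocks
        t.start()
    q.join()              # blocks until every task_done() has been called;
                          # a worker stuck in a slow request with no timeout
                          # is where the hang reported above would manifest
    return results

print(fetch_all(["zlib", "busybox"])["zlib"])  # -> (False, '1.0-zlib', None)
```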